9396 1727204023.43871: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
9396 1727204023.44405: Added group all to inventory
9396 1727204023.44408: Added group ungrouped to inventory
9396 1727204023.44413: Group all now contains ungrouped
9396 1727204023.44416: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
9396 1727204023.69617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
9396 1727204023.69801: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
9396 1727204023.69829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
9396 1727204023.69904: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
9396 1727204023.70033: Loaded config def from plugin (inventory/script)
9396 1727204023.70036: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
9396 1727204023.70088: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
9396 1727204023.70245: Loaded config def from plugin (inventory/yaml)
9396 1727204023.70248: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
9396 1727204023.70356: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
9396 1727204023.71085: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
9396 1727204023.71091: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
9396 1727204023.71096: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
9396 1727204023.71102: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
9396 1727204023.71107: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
9396 1727204023.71404: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
9396 1727204023.71482: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
9396 1727204023.71547: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
9396 1727204023.71671: group all already in inventory
9396 1727204023.71680: set inventory_file for managed-node1
9396 1727204023.71685: set inventory_dir for managed-node1
9396 1727204023.71687: Added host managed-node1 to inventory
9396 1727204023.71692: Added host managed-node1 to group all
9396 1727204023.71693: set ansible_host for managed-node1
9396 1727204023.71694: set ansible_ssh_extra_args for managed-node1
9396 1727204023.71699: set inventory_file for managed-node2
9396 1727204023.71703: set inventory_dir for managed-node2
9396 1727204023.71704: Added host managed-node2 to inventory
9396 1727204023.71706: Added host managed-node2 to group all
9396 1727204023.71707: set ansible_host for managed-node2
9396 1727204023.71708: set ansible_ssh_extra_args for managed-node2
9396 1727204023.71711: set inventory_file for managed-node3
9396 1727204023.71714: set inventory_dir for managed-node3
9396 1727204023.71715: Added host managed-node3 to inventory
9396 1727204023.71716: Added host managed-node3 to group all
9396 1727204023.71717: set ansible_host for managed-node3
9396 1727204023.71718: set ansible_ssh_extra_args for managed-node3
9396 1727204023.71722: Reconcile groups and hosts in inventory.
9396 1727204023.71727: Group ungrouped now contains managed-node1
9396 1727204023.71729: Group ungrouped now contains managed-node2
9396 1727204023.71731: Group ungrouped now contains managed-node3
9396 1727204023.71868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
9396 1727204023.72070: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
9396 1727204023.72136: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
9396 1727204023.72174: Loaded config def from plugin (vars/host_group_vars)
9396 1727204023.72177: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
9396 1727204023.72185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
9396 1727204023.72300: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
9396 1727204023.72353: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
9396 1727204023.72873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204023.73268: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
9396 1727204023.73367: Loaded config def from plugin (connection/local)
9396 1727204023.73371: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
9396 1727204023.74582: Loaded config def from plugin (connection/paramiko_ssh)
9396 1727204023.74586: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
9396 1727204023.75955: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
9396 1727204023.76015: Loaded config def from plugin (connection/psrp)
9396 1727204023.76019: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
9396 1727204023.77243: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
9396 1727204023.77331: Loaded config def from plugin (connection/ssh)
9396 1727204023.77335: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
9396 1727204023.80763: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
9396 1727204023.80860: Loaded config def from plugin (connection/winrm)
9396 1727204023.80865: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
9396 1727204023.80914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
9396 1727204023.81126: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
9396 1727204023.81295: Loaded config def from plugin (shell/cmd)
9396 1727204023.81298: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
9396 1727204023.81335: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
9396 1727204023.81524: Loaded config def from plugin (shell/powershell)
9396 1727204023.81527: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
9396 1727204023.81601: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
9396 1727204023.82070: Loaded config def from plugin (shell/sh)
9396 1727204023.82073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
9396 1727204023.82126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
9396 1727204023.82388: Loaded config def from plugin (become/runas)
9396 1727204023.82394: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
9396 1727204023.82729: Loaded config def from plugin (become/su)
9396 1727204023.82735: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
9396 1727204023.82971: Loaded config def from plugin (become/sudo)
9396 1727204023.82974: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
9396 1727204023.83028: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml
9396 1727204023.83565: in VariableManager get_vars()
9396 1727204023.83650: done with get_vars()
9396 1727204023.83828: trying /usr/local/lib/python3.12/site-packages/ansible/modules
9396 1727204023.88720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
9396 1727204023.88886: in VariableManager get_vars()
9396 1727204023.88894: done with get_vars()
9396 1727204023.88898: variable 'playbook_dir' from source: magic vars
9396 1727204023.88899: variable 'ansible_playbook_python' from source: magic vars
9396 1727204023.88900: variable 'ansible_config_file' from source: magic vars
9396 1727204023.88901: variable 'groups' from source: magic vars
9396 1727204023.88902: variable 'omit' from source: magic vars
9396 1727204023.88903: variable 'ansible_version' from source: magic vars
9396 1727204023.88904: variable 'ansible_check_mode' from source: magic vars
9396 1727204023.88905: variable 'ansible_diff_mode' from source: magic vars
9396 1727204023.88906: variable 'ansible_forks' from source: magic vars
9396 1727204023.88907: variable 'ansible_inventory_sources' from source: magic vars
9396 1727204023.88910: variable 'ansible_skip_tags' from source: magic vars
9396 1727204023.88911: variable 'ansible_limit' from source: magic vars
9396 1727204023.88912: variable 'ansible_run_tags' from source: magic vars
9396 1727204023.88913: variable 'ansible_verbosity' from source: magic vars
9396 1727204023.88961: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml
9396 1727204023.90296: in VariableManager get_vars()
9396 1727204023.90313: done with get_vars()
9396 1727204023.90321: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
9396 1727204023.91419: in VariableManager get_vars()
9396 1727204023.91439: done with get_vars()
9396 1727204023.91447: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
9396 1727204023.91541: in VariableManager get_vars()
9396 1727204023.91577: done with get_vars()
9396 1727204023.91695: in VariableManager get_vars()
9396 1727204023.91706: done with get_vars()
9396 1727204023.91715: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
9396 1727204023.91775: in VariableManager get_vars()
9396 1727204023.91787: done with get_vars()
9396 1727204023.92122: in VariableManager get_vars()
9396 1727204023.92134: done with get_vars()
9396 1727204023.92138: variable 'omit' from source: magic vars
9396 1727204023.92152: variable 'omit' from source: magic vars
9396 1727204023.92179: in VariableManager get_vars()
9396 1727204023.92187: done with get_vars()
9396 1727204023.92232: in VariableManager get_vars()
9396 1727204023.92242: done with get_vars()
9396 1727204023.92271: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
9396 1727204023.92454: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
9396 1727204023.92562: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
9396 1727204023.93301: in VariableManager get_vars()
9396 1727204023.93325: done with get_vars()
9396 1727204023.93718: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
9396 1727204023.93836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
9396 1727204023.95153: in VariableManager get_vars()
9396 1727204023.95166: done with get_vars()
9396 1727204023.95175: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
9396 1727204023.95314: in VariableManager get_vars()
9396 1727204023.95329: done with get_vars()
9396 1727204023.95428: in VariableManager get_vars()
9396 1727204023.95442: done with get_vars()
9396 1727204023.95758: in VariableManager get_vars()
9396 1727204023.95772: done with get_vars()
9396 1727204023.95776: variable 'omit' from source: magic vars
9396 1727204023.95797: variable 'omit' from source: magic vars
9396 1727204023.95832: in VariableManager get_vars()
9396 1727204023.95843: done with get_vars()
9396 1727204023.95860: in VariableManager get_vars()
9396 1727204023.95871: done with get_vars()
9396 1727204023.95906: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
9396 1727204023.95996: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
9396 1727204023.97220: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
9396 1727204023.97710: in VariableManager get_vars()
9396 1727204023.97742: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
9396 1727204024.00148: in VariableManager get_vars()
9396 1727204024.00172: done with get_vars()
9396 1727204024.00182: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
9396 1727204024.00783: in VariableManager get_vars()
9396 1727204024.00810: done with get_vars()
9396 1727204024.00879: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
9396 1727204024.00898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
9396 1727204024.01160: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
9396 1727204024.01377: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
9396 1727204024.01380: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
9396 1727204024.01420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
9396 1727204024.01451: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
9396 1727204024.01647: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
9396 1727204024.01700: Loaded config def from plugin (callback/default)
9396 1727204024.01702: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
9396 1727204024.02671: Loaded config def from plugin (callback/junit)
9396 1727204024.02674: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
9396 1727204024.02716: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
9396 1727204024.02771: Loaded config def from plugin (callback/minimal)
9396 1727204024.02773: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
9396 1727204024.02812: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
9396 1727204024.02861: Loaded config def from plugin (callback/tree)
9396 1727204024.02863: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
9396 1727204024.02966: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
9396 1727204024.02968: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_bond_deprecated_nm.yml ***************************************** 2 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml 9396 1727204024.02992: in VariableManager get_vars() 9396 1727204024.03005: done with get_vars() 9396 1727204024.03012: in VariableManager get_vars() 9396 1727204024.03018: done with get_vars() 9396 1727204024.03022: variable 'omit' from source: magic vars 9396 1727204024.03052: in VariableManager get_vars() 9396 1727204024.03063: done with get_vars() 9396 1727204024.03079: variable 'omit' from source: magic vars PLAY [Run playbook 'playbooks/tests_bond_deprecated.yml' with nm as provider] *** 9396 1727204024.03550: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy 9396 1727204024.03614: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py 9396 1727204024.03642: getting the remaining hosts for this loop 9396 1727204024.03644: done getting the remaining hosts for this loop 9396 1727204024.03646: getting the next task for host managed-node1 9396 1727204024.03649: done getting next task for host managed-node1 9396 1727204024.03651: ^ task is: TASK: Gathering Facts 9396 1727204024.03652: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204024.03654: getting variables 9396 1727204024.03656: in VariableManager get_vars() 9396 1727204024.03665: Calling all_inventory to load vars for managed-node1 9396 1727204024.03667: Calling groups_inventory to load vars for managed-node1 9396 1727204024.03670: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204024.03680: Calling all_plugins_play to load vars for managed-node1 9396 1727204024.03691: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204024.03694: Calling groups_plugins_play to load vars for managed-node1 9396 1727204024.03724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204024.03767: done with get_vars() 9396 1727204024.03773: done getting variables 9396 1727204024.03838: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6 Tuesday 24 September 2024 14:53:44 -0400 (0:00:00.009) 0:00:00.009 ***** 9396 1727204024.03856: entering _queue_task() for managed-node1/gather_facts 9396 1727204024.03857: Creating lock for gather_facts 9396 1727204024.04165: worker is 1 (out of 1 available) 9396 1727204024.04180: exiting _queue_task() for managed-node1/gather_facts 9396 1727204024.04197: done queuing things up, now waiting for results queue to drain 9396 1727204024.04200: waiting for pending results... 
9396 1727204024.04345: running TaskExecutor() for managed-node1/TASK: Gathering Facts 9396 1727204024.04406: in run() - task 12b410aa-8751-36c5-1f9e-0000000000cd 9396 1727204024.04421: variable 'ansible_search_path' from source: unknown 9396 1727204024.04456: calling self._execute() 9396 1727204024.04511: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204024.04520: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204024.04529: variable 'omit' from source: magic vars 9396 1727204024.04617: variable 'omit' from source: magic vars 9396 1727204024.04642: variable 'omit' from source: magic vars 9396 1727204024.04675: variable 'omit' from source: magic vars 9396 1727204024.04717: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204024.04785: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204024.04804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204024.04823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204024.04835: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204024.04862: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204024.04867: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204024.04870: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204024.04955: Set connection var ansible_timeout to 10 9396 1727204024.04961: Set connection var ansible_shell_executable to /bin/sh 9396 1727204024.04970: Set connection var ansible_pipelining to False 9396 1727204024.04976: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 
1727204024.04984: Set connection var ansible_connection to ssh 9396 1727204024.04993: Set connection var ansible_shell_type to sh 9396 1727204024.05018: variable 'ansible_shell_executable' from source: unknown 9396 1727204024.05022: variable 'ansible_connection' from source: unknown 9396 1727204024.05025: variable 'ansible_module_compression' from source: unknown 9396 1727204024.05028: variable 'ansible_shell_type' from source: unknown 9396 1727204024.05032: variable 'ansible_shell_executable' from source: unknown 9396 1727204024.05035: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204024.05041: variable 'ansible_pipelining' from source: unknown 9396 1727204024.05043: variable 'ansible_timeout' from source: unknown 9396 1727204024.05049: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204024.05213: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204024.05218: variable 'omit' from source: magic vars 9396 1727204024.05224: starting attempt loop 9396 1727204024.05227: running the handler 9396 1727204024.05241: variable 'ansible_facts' from source: unknown 9396 1727204024.05257: _low_level_execute_command(): starting 9396 1727204024.05265: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204024.05800: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204024.05812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204024.05827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204024.05877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204024.05881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204024.05944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204024.07769: stdout chunk (state=3): >>>/root <<< 9396 1727204024.07905: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204024.07965: stderr chunk (state=3): >>><<< 9396 1727204024.07969: stdout chunk (state=3): >>><<< 9396 1727204024.07992: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204024.08005: _low_level_execute_command(): starting 9396 1727204024.08014: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105 `" && echo ansible-tmp-1727204024.0799267-9565-159407349289105="` echo /root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105 `" ) && sleep 0' 9396 1727204024.08552: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204024.08555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204024.08558: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204024.08560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204024.08621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
9396 1727204024.08627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
9396 1727204024.08671: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
9396 1727204024.10762: stdout chunk (state=3): >>>ansible-tmp-1727204024.0799267-9565-159407349289105=/root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105 <<<
9396 1727204024.10872: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
9396 1727204024.10933: stderr chunk (state=3): >>><<<
9396 1727204024.10939: stdout chunk (state=3): >>><<<
9396 1727204024.10969: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204024.0799267-9565-159407349289105=/root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
9396 1727204024.11003: variable 'ansible_module_compression' from source: unknown
9396 1727204024.11052: ANSIBALLZ: Using generic lock for ansible.legacy.setup
9396 1727204024.11056: ANSIBALLZ: Acquiring lock
9396 1727204024.11059: ANSIBALLZ: Lock acquired: 139797141880704
9396 1727204024.11066: ANSIBALLZ: Creating module
9396 1727204024.41774: ANSIBALLZ: Writing module into payload
9396 1727204024.42138: ANSIBALLZ: Writing module
9396 1727204024.42142: ANSIBALLZ: Renaming module
9396 1727204024.42144: ANSIBALLZ: Done creating module
9396 1727204024.42146: variable 'ansible_facts' from source: unknown
9396 1727204024.42148: variable 'inventory_hostname' from source: host vars for 'managed-node1'
9396 1727204024.42150: _low_level_execute_command(): starting
9396 1727204024.42152: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
9396 1727204024.42683: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
9396 1727204024.42702: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
9396 1727204024.42719: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
9396 1727204024.42738: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
9396 1727204024.42757: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<<
9396 1727204024.42768: stderr chunk (state=3): >>>debug2: match not found <<<
9396 1727204024.42783: stderr chunk
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204024.42812: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<<
9396 1727204024.42824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<<
9396 1727204024.42903: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204024.42923: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
9396 1727204024.42939: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
9396 1727204024.42958: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
9396 1727204024.43194: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
9396 1727204024.45080: stdout chunk (state=3): >>>PLATFORM <<<
9396 1727204024.45297: stdout chunk (state=3): >>>Linux <<<
9396 1727204024.45301: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 <<<
9396 1727204024.45321: stdout chunk (state=3): >>>/usr/bin/python3 <<<
9396 1727204024.45327: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<<
9396 1727204024.45501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
9396 1727204024.45598: stderr chunk (state=3): >>><<<
9396 1727204024.45601: stdout chunk (state=3): >>><<<
9396 1727204024.45604: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
9396 1727204024.45612 [managed-node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3']
9396 1727204024.45616: _low_level_execute_command(): starting
9396 1727204024.45618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0'
9396 1727204024.45696: Sending initial data
9396 1727204024.45700: Sent initial data (1181 bytes)
9396 1727204024.46068: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
9396 1727204024.46118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<<
9396 1727204024.46122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1:
configuration requests final Match pass <<<
9396 1727204024.46124: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
9396 1727204024.46131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204024.46205: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
9396 1727204024.46209: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
9396 1727204024.46260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
9396 1727204024.50177: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<<
9396 1727204024.50897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
9396 1727204024.50901: stdout chunk (state=3): >>><<<
9396 1727204024.50903: stderr chunk (state=3): >>><<<
9396 1727204024.50905: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
9396 1727204024.50996: variable 'ansible_facts' from source: unknown
9396 1727204024.51007: variable 'ansible_facts' from source: unknown
9396 1727204024.51030: variable 'ansible_module_compression' from source: unknown
9396 1727204024.51078: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED
9396 1727204024.51129: variable 'ansible_facts' from source: unknown
9396 1727204024.51294: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/AnsiballZ_setup.py
9396 1727204024.51582: Sending initial data
9396 1727204024.51586: Sent initial data (152 bytes)
9396 1727204024.52153: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204024.52208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204024.52268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
9396 1727204024.52288: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
9396 1727204024.52324: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
9396 1727204024.52405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
9396 1727204024.54301: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
9396 1727204024.54350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "."
<<<
9396 1727204024.54400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp0qtows97 /root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/AnsiballZ_setup.py <<<
9396 1727204024.54414: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/AnsiballZ_setup.py" <<<
9396 1727204024.54440: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp0qtows97" to remote "/root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/AnsiballZ_setup.py" <<<
9396 1727204024.56896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
9396 1727204024.56900: stdout chunk (state=3): >>><<<
9396 1727204024.56902: stderr chunk (state=3): >>><<<
9396 1727204024.56904: done transferring module to remote
9396 1727204024.56906: _low_level_execute_command(): starting
9396 1727204024.56908: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/ /root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/AnsiballZ_setup.py && sleep 0'
9396 1727204024.57524: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
9396 1727204024.57546: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
9396 1727204024.57560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
9396 1727204024.57606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
9396 1727204024.57620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
9396 1727204024.57690: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204024.57722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
9396 1727204024.57743: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
9396 1727204024.57758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
9396 1727204024.57882: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
9396 1727204024.59979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
9396 1727204024.59984: stdout chunk (state=3): >>><<<
9396 1727204024.59986: stderr chunk (state=3): >>><<<
9396 1727204024.60072: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
9396 1727204024.60076: _low_level_execute_command(): starting
9396 1727204024.60078: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/AnsiballZ_setup.py && sleep 0'
9396 1727204024.60697: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
9396 1727204024.60701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
9396 1727204024.60704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
9396 1727204024.60706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
9396 1727204024.60723: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<<
9396 1727204024.60732: stderr chunk (state=3): >>>debug2: match not found <<<
9396 1727204024.60878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
9396 1727204024.60882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
9396 1727204024.60938: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
9396 1727204024.63305: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<<
9396 1727204024.63350: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<<
9396 1727204024.63424: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<<
9396 1727204024.63462: stdout chunk (state=3): >>>import 'posix' # <<<
9396 1727204024.63508: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<<
9396 1727204024.63533: stdout chunk (state=3): >>># installing zipimport hook import 'time' # <<<
9396 1727204024.63550: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook <<<
9396 1727204024.63664: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # <<<
9396 1727204024.63714: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552a0d44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552a0a3ad0> <<<
9396 1727204024.63780: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from
'/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552a0d6a20> <<<
9396 1727204024.63878: stdout chunk (state=3): >>>import '_signal' # <<<
9396 1727204024.63882: stdout chunk (state=3): >>>import '_abc' # <<<
9396 1727204024.63924: stdout chunk (state=3): >>>import 'abc' # import 'io' # import '_stat' # import 'stat' # <<<
9396 1727204024.64024: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # import 'posixpath' # <<<
9396 1727204024.64105: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<<
9396 1727204024.64132: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<<
9396 1727204024.64229: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ec50a0> <<<
9396 1727204024.64267: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ec5fd0> <<<
9396 1727204024.64296: stdout chunk (state=3): >>>import 'site' # <<<
9396 1727204024.64393: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<<
9396 1727204024.64717: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<<
9396 1727204024.64751: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<<
9396 1727204024.64774: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<<
9396 1727204024.64806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<<
9396 1727204024.64852: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<<
9396 1727204024.64922: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<<
9396 1727204024.64926: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f03e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<<
9396 1727204024.64966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f03ec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<<
9396 1727204024.64995: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<<
9396 1727204024.65016: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<<
9396 1727204024.65247: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<<
9396 1727204024.65259: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f3b800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f3be90> import '_collections' # <<<
9396 1727204024.65280: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f1bad0> import '_functools' # <<<
9396 1727204024.65305: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f191f0> <<<
9396 1727204024.65436: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f00fb0> <<<
9396 1727204024.65493: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<<
9396 1727204024.65522: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<<
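The stdout chunks above are a Python module-import trace: they appear because the controller launched the module with `PYTHONVERBOSE=1` (see the `executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 ...'` entry earlier in the log). The same kind of trace can be reproduced locally with Python's `-v` flag, which is equivalent to setting `PYTHONVERBOSE=1`; this is a minimal illustrative sketch, not Ansible code:

```python
# Sketch: reproduce a PYTHONVERBOSE-style import trace like the stdout
# chunks above. `python -v` is equivalent to PYTHONVERBOSE=1 in the env.
import subprocess
import sys

proc = subprocess.run(
    [sys.executable, "-v", "-c", "import base64"],
    capture_output=True,
    text=True,
)
# The verbose trace goes to stderr, one "import ..." line per loaded module.
trace = [line for line in proc.stderr.splitlines() if line.startswith("import ")]
print(len(trace) > 0)  # True
```

In the log itself the trace arrives on stdout rather than stderr because the remote shell and SSH multiplexing merge the streams per chunk before Ansible reads them.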
9396 1727204024.66023: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f5f710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f5e330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f1a1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f02ea0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f90740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f00230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<<
9396 1727204024.66102: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529f90bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f90aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529f90e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529efed50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f91550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f91220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f92450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<<
9396 1727204024.66139: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529fac680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529faddc0> <<<
9396 1727204024.66193: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches
/usr/lib64/python3.12/bz2.py <<< 9396 1727204024.66201: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 9396 1727204024.66248: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529faecc0> <<< 9396 1727204024.66372: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529faf320> <<< 9396 1727204024.66387: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529fae210> <<< 9396 1727204024.66425: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 9396 1727204024.66429: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 9396 1727204024.66431: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529fafda0> <<< 9396 1727204024.66434: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529faf4d0> <<< 9396 1727204024.66688: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f924b0> # 
/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 9396 1727204024.66694: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529cabd10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529cd4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529cd4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529cd4740> <<< 9396 1727204024.66697: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader 
object at 0x7f5529cd4920> <<< 9396 1727204024.66722: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ca9eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 9396 1727204024.66831: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 9396 1727204024.66862: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 9396 1727204024.66897: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529cd5f70> <<< 9396 1727204024.66922: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529cd4bf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f92ba0> <<< 9396 1727204024.66945: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 9396 1727204024.67012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204024.67042: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 9396 1727204024.67132: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 9396 1727204024.67222: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d022d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from 
'/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 9396 1727204024.67259: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d1a3f0> <<< 9396 1727204024.67284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 9396 1727204024.67338: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 9396 1727204024.67497: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d571d0> <<< 9396 1727204024.67570: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 9396 1727204024.67592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 9396 1727204024.67659: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d79970> <<< 9396 1727204024.67816: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d572f0> <<< 9396 1727204024.67820: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d1b080> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ba02c0> <<< 9396 1727204024.67842: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d19430> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529cd6e70> <<< 9396 1727204024.68013: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 9396 1727204024.68029: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5529d19550> <<< 9396 1727204024.68214: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_fv74n_4b/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 9396 1727204024.68371: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.68412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 9396 1727204024.68416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 9396 1727204024.68458: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 9396 1727204024.68539: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 9396 1727204024.68583: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5529c06030> <<< 9396 1727204024.68598: stdout chunk (state=3): >>>import '_typing' # <<< 9396 1727204024.68805: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529bdcf20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ba3fb0> <<< 9396 1727204024.68856: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.68885: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 9396 1727204024.68916: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.70521: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.71981: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529bdfec0> <<< 9396 1727204024.72011: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529c39af0> <<< 9396 1727204024.72077: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c39880> <<< 9396 1727204024.72115: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c39190> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 9396 1727204024.72163: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c395e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c06cc0> import 'atexit' # <<< 9396 1727204024.72216: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529c3a8a0> <<< 9396 1727204024.72250: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529c3aae0> <<< 9396 1727204024.72254: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 9396 1727204024.72387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 9396 1727204024.72392: stdout chunk 
(state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c3afc0> import 'pwd' # <<< 9396 1727204024.72423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 9396 1727204024.72515: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a9ce00> <<< 9396 1727204024.72520: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529a9ea20> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 9396 1727204024.72544: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 9396 1727204024.72568: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a9f3b0> <<< 9396 1727204024.72658: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa0590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 9396 1727204024.72793: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 9396 1727204024.72797: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 9396 
1727204024.72803: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 9396 1727204024.72911: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa3020> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529aa3170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa11f0> <<< 9396 1727204024.72915: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 9396 1727204024.72952: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 9396 1727204024.73054: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 9396 1727204024.73062: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa6ed0> <<< 9396 1727204024.73080: stdout chunk (state=3): >>>import '_tokenize' # <<< 9396 1727204024.73143: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa59a0> import 
'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa5700> <<< 9396 1727204024.73146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 9396 1727204024.73157: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 9396 1727204024.73239: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa7fb0> <<< 9396 1727204024.73277: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa17f0> <<< 9396 1727204024.73280: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204024.73336: stdout chunk (state=3): >>># extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529aeaf90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aeb140> <<< 9396 1727204024.73348: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 9396 1727204024.73361: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 9396 1727204024.73450: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from 
'/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529af0d10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529af0ad0> <<< 9396 1727204024.73453: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 9396 1727204024.73585: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 9396 1727204024.73638: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204024.73642: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529af32c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529af1400> <<< 9396 1727204024.73660: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 9396 1727204024.73760: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204024.73763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 9396 1727204024.73766: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 9396 1727204024.73814: stdout chunk (state=3): >>>import 'string' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5529afaae0> <<< 9396 1727204024.74014: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529af3470> <<< 9396 1727204024.74140: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afbdd0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afbbc0> <<< 9396 1727204024.74198: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afbf20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aeb440> <<< 9396 1727204024.74224: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 9396 1727204024.74321: stdout chunk (state=3): >>># code object 
from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204024.74345: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afeb10> <<< 9396 1727204024.74576: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529affe90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529afd2b0> <<< 9396 1727204024.74667: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afe660> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529afce60> # zipimport: zlib available # zipimport: zlib available <<< 9396 1727204024.74683: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 9396 1727204024.74725: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.74866: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 9396 1727204024.74897: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # 
zipimport: zlib available <<< 9396 1727204024.75059: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.75211: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.75946: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.76610: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 9396 1727204024.76751: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204024.76755: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529988080> <<< 9396 1727204024.76851: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 9396 1727204024.76866: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55299890a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529b03530> <<< 9396 1727204024.77002: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 9396 1727204024.77020: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 9396 1727204024.77405: stdout chunk (state=3): >>># zipimport: 
zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 9396 1727204024.77409: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529989220> # zipimport: zlib available <<< 9396 1727204024.78118: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.78585: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.78672: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.78771: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 9396 1727204024.78788: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.78821: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.78861: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 9396 1727204024.78879: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.79012: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.79086: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 9396 1727204024.79120: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 9396 1727204024.79138: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 9396 1727204024.79171: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.79222: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 9396 1727204024.79569: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.79817: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 9396 1727204024.79900: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 9396 1727204024.79903: stdout chunk (state=3): >>>import '_ast' # <<< 9396 1727204024.80046: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552998b560> # zipimport: zlib available <<< 9396 1727204024.80105: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.80196: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 9396 1727204024.80235: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 9396 1727204024.80240: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 9396 1727204024.80258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 9396 1727204024.80522: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529991c40> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55299925a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552998a690> <<< 9396 1727204024.80547: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.80594: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.80636: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.locale' # <<< 9396 1727204024.80653: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.80698: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.80735: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.80802: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.80874: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 9396 1727204024.80924: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204024.81024: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529991400> <<< 9396 1727204024.81070: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529992840> <<< 9396 1727204024.81111: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 9396 1727204024.81125: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.81194: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.81259: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.81294: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.81519: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 9396 1727204024.81563: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a2ab10> <<< 9396 1727204024.81618: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552999c7d0> <<< 9396 1727204024.81733: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552999bc50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552999a6f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 9396 1727204024.81757: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 9396 1727204024.81781: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 9396 1727204024.81850: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 9396 1727204024.81881: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 9396 1727204024.81897: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.81962: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82030: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 9396 1727204024.82072: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82080: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82122: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82167: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82210: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82248: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 9396 1727204024.82263: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82382: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82513: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82529: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available <<< 9396 1727204024.82698: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82894: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.82936: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.83001: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204024.83037: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 9396 1727204024.83056: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 9396 1727204024.83086: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 9396 
1727204024.83116: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 9396 1727204024.83142: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a31100> <<< 9396 1727204024.83155: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 9396 1727204024.83185: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 9396 1727204024.83233: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 9396 1727204024.83266: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 9396 1727204024.83270: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 9396 1727204024.83287: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4c140> <<< 9396 1727204024.83327: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204024.83330: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204024.83354: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528f4c440> <<< 9396 1727204024.83388: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552997d1f0> <<< 9396 1727204024.83413: 
stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552997c7a0> <<< 9396 1727204024.83444: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a333b0> <<< 9396 1727204024.83479: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a32ab0> <<< 9396 1727204024.83482: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 9396 1727204024.83548: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 9396 1727204024.83551: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 9396 1727204024.83592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 9396 1727204024.83622: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 9396 1727204024.83642: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528f4f380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4ec30> <<< 9396 1727204024.83673: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528f4ee10> <<< 9396 1727204024.83712: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4e090> <<< 9396 1727204024.83716: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 9396 1727204024.83846: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 9396 1727204024.83871: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4f4d0> <<< 9396 1727204024.83888: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 9396 1727204024.83920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 9396 1727204024.83949: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528fba000> <<< 9396 1727204024.84019: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4ffe0> <<< 9396 1727204024.84023: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a32840> import 'ansible.module_utils.facts.timeout' # <<< 9396 1727204024.84071: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 9396 1727204024.84074: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 9396 1727204024.84098: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84159: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84213: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 9396 1727204024.84235: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84287: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84361: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 9396 1727204024.84364: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84394: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 9396 1727204024.84425: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84455: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 9396 1727204024.84471: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84524: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84577: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 9396 1727204024.84592: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84627: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84676: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 9396 1727204024.84688: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84747: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84814: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.84872: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 9396 1727204024.84941: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 9396 1727204024.84954: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 9396 1727204024.85520: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 9396 1727204024.86057: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86111: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86168: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86208: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86248: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 9396 1727204024.86265: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86294: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86324: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 9396 1727204024.86337: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86398: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86456: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 9396 1727204024.86474: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86512: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86538: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 9396 1727204024.86551: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86575: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86621: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 9396 1727204024.86716: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.86828: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 9396 1727204024.86832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 9396 1727204024.86870: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528fbb3b0> <<< 9396 1727204024.86874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 9396 1727204024.86901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 9396 1727204024.87031: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528fbac60> import 'ansible.module_utils.facts.system.local' # <<< 9396 1727204024.87052: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.87118: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.87197: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 9396 1727204024.87310: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.87423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 9396 1727204024.87427: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.87718: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from 
'/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 9396 1727204024.87803: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204024.87870: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204024.87892: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528fe62a0> <<< 9396 1727204024.88088: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528fd3140> <<< 9396 1727204024.88116: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 9396 1727204024.88168: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.88230: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 9396 1727204024.88300: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.88333: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.88424: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.88551: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.88713: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # <<< 9396 1727204024.88735: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available <<< 9396 1727204024.88784: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.88823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 9396 1727204024.88872: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.88929: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches 
/usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 9396 1727204024.88965: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204024.89016: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528e01df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e01a60> import 'ansible.module_utils.facts.system.user' # <<< 9396 1727204024.89044: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 9396 1727204024.89062: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.89101: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.89155: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 9396 1727204024.89158: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.89328: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.89504: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available <<< 9396 1727204024.89622: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.89729: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.89838: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.89901: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 9396 1727204024.89958: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 9396 1727204024.90137: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 9396 1727204024.90268: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 9396 1727204024.90362: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.90379: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.90596: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 9396 1727204024.90622: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 9396 1727204024.91233: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.91819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 9396 1727204024.91827: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.91929: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.92052: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 9396 1727204024.92069: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.92166: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.92274: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 9396 1727204024.92470: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.92473: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.92638: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 9396 1727204024.92717: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available <<< 9396 1727204024.92732: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.92776: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.base' # <<< 9396 1727204024.92796: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.92898: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.93002: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.93243: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.93469: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 9396 1727204024.93495: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 9396 1727204024.93529: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.93566: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 9396 1727204024.93587: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.93637: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.93812: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 9396 1727204024.93837: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available <<< 9396 1727204024.93864: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 9396 1727204024.93928: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.94016: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 9396 1727204024.94061: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.94122: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 9396 1727204024.94137: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.94436: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 9396 1727204024.94738: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 9396 1727204024.94758: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.95015: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 9396 1727204024.95020: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.95058: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available <<< 9396 1727204024.95112: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.95127: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 9396 1727204024.95275: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.95311: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 9396 1727204024.95510: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 9396 1727204024.95534: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.95553: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.95808: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 9396 1727204024.95911: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # 
zipimport: zlib available <<< 9396 1727204024.96126: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.96343: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 9396 1727204024.96408: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.96422: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.96450: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 9396 1727204024.96467: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.96707: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available <<< 9396 1727204024.96769: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 9396 1727204024.96772: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.96866: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204024.96961: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 9396 1727204024.97087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available <<< 9396 1727204024.98004: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 9396 1727204024.98024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 9396 1727204024.98208: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from 
'/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528e2b440> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e29790> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e27fb0> <<< 9396 1727204025.13031: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e73050> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e71460> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204025.13152: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e737a0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5528e72420> <<< 9396 1727204025.13515: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 9396 1727204025.36446: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_loadavg": {"1m": 0.68310546875, "5m": 0.3642578125, "15m": 0.1630859375}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "44", "epoch": "1727204024", "epoch_int": "1727204024", "date": "2024-09-24", "time": "14:53:44", "iso8601_micro": "2024-09-24T18:53:44.983831Z", "iso8601": 
"2024-09-24T18:53:44Z", "iso8601_basic": "20240924T145344983831", "iso8601_basic_short": "20240924T145344", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_hostnqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_<<< 9396 1727204025.36519: stdout chunk (state=3): >>>64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2833, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 884, "free": 2833}, "nocache": {"free": 3454, "used": 263}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 
0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_uuid": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, 
"ansible_uptime_seconds": 515, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251157786624, "block_size": 4096, "block_total": 64479564, "block_available": 61317819, "block_used": 3161745, "inode_total": 16384000, "inode_available": 16302271, "inode_used": 81729, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/u<<< 9396 1727204025.36612: stdout chunk (state=3): >>>sr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": 
"host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d080:f60d:659:9515", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.210"], "ansible_all_ipv6_addresses": ["fe80::d080:f60d:659:9515"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.210", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d080:f60d:659:9515"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 9396 1727204025.37119: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 <<< 9396 1727204025.37141: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback <<< 9396 1727204025.37161: stdout chunk (state=3): >>># clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing 
_frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections <<< 9396 1727204025.37303: stdout chunk (state=3): >>># cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing 
_compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect <<< 9396 1727204025.37322: stdout chunk (state=3): >>># cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # 
cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_uti<<< 9396 1727204025.37478: stdout chunk (state=3): >>>ls.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # 
cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl 
# cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys<<< 9396 1727204025.37506: stdout chunk (state=3): >>> # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing 
ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy 
ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub<<< 9396 1727204025.37513: stdout chunk (state=3): >>>_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 9396 1727204025.37895: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner <<< 9396 1727204025.38116: stdout chunk (state=3): >>># destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy 
ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue <<< 9396 1727204025.38143: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors <<< 9396 1727204025.38173: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 9396 1727204025.38198: stdout chunk (state=3): >>># destroy _ssl <<< 9396 1727204025.38307: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection <<< 9396 1727204025.38398: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep<<< 9396 1727204025.38402: stdout chunk (state=3): >>> # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 9396 1727204025.38405: stdout chunk (state=3): >>># cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 <<< 9396 1727204025.38410: stdout chunk (state=3): 
>>># cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime <<< 9396 1727204025.38520: stdout chunk (state=3): >>># cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 9396 1727204025.38524: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat <<< 9396 1727204025.38571: stdout chunk (state=3): >>># cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io <<< 9396 1727204025.38577: stdout chunk 
(state=3): >>># cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 9396 1727204025.38593: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 9396 1727204025.38765: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 9396 1727204025.38785: stdout chunk (state=3): >>># destroy _collections <<< 9396 1727204025.38880: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 9396 1727204025.38916: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 9396 1727204025.38934: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 9396 1727204025.38948: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 9396 1727204025.39094: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 9396 
1727204025.39118: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 9396 1727204025.39142: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 9396 1727204025.39187: stdout chunk (state=3): >>># destroy itertools <<< 9396 1727204025.39422: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 9396 1727204025.39610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204025.39627: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. <<< 9396 1727204025.39814: stderr chunk (state=3): >>><<< 9396 1727204025.39818: stdout chunk (state=3): >>><<< 9396 1727204025.40294: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552a0d44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552a0a3ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 
'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552a0d6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ec50a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ec5fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f03e00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f03ec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f3b800> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5529f3be90> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f1bad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f191f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f00fb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f5f710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f5e330> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f1a1e0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f02ea0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f90740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f00230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529f90bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f90aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529f90e90> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529efed50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f91550> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f91220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f92450> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529fac680> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529faddc0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529faecc0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529faf320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529fae210> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529fafda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529faf4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f924b0> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529cabd10> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529cd4710> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529cd4470> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529cd4740> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529cd4920> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ca9eb0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529cd5f70> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529cd4bf0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529f92ba0> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d022d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d1a3f0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d571d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d79970> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d572f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d1b080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ba02c0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529d19430> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529cd6e70> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5529d19550> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_fv74n_4b/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # 
code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c06030> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529bdcf20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529ba3fb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529bdfec0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529c39af0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c39880> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c39190> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c395e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c06cc0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529c3a8a0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529c3aae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529c3afc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a9ce00> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529a9ea20> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a9f3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa0590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa3020> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529aa3170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa11f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa6ed0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa59a0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa5700> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa7fb0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aa17f0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529aeaf90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aeb140> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529af0d10> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529af0ad0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529af32c0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529af1400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529afaae0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529af3470> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afbdd0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afbbc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afbf20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529aeb440> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afeb10> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529affe90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529afd2b0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529afe660> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529afce60> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529988080> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f55299890a0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529b03530> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529989220> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552998b560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529991c40> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f55299925a0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552998a690> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5529991400> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529992840> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a2ab10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552999c7d0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552999bc50> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552999a6f0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a31100> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4c140> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528f4c440> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f552997d1f0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f552997c7a0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a333b0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a32ab0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528f4f380> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4ec30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528f4ee10> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4e090> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4f4d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528fba000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528f4ffe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5529a32840> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528fbb3b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528fbac60> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528fe62a0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528fd3140> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528e01df0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e01a60> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5528e2b440> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e29790> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e27fb0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e73050> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e71460> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e737a0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5528e72420> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", 
"ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_loadavg": {"1m": 0.68310546875, "5m": 0.3642578125, "15m": 0.1630859375}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "44", "epoch": "1727204024", "epoch_int": "1727204024", "date": 
"2024-09-24", "time": "14:53:44", "iso8601_micro": "2024-09-24T18:53:44.983831Z", "iso8601": "2024-09-24T18:53:44Z", "iso8601_basic": "20240924T145344983831", "iso8601_basic_short": "20240924T145344", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_local": {}, "ansible_hostnqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_is_chroot": false, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2833, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 884, "free": 2833}, "nocache": {"free": 3454, "used": 263}, "swap": {"total": 
3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_uuid": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, 
"labels": {}, "masters": {}}, "ansible_uptime_seconds": 515, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251157786624, "block_size": 4096, "block_total": 64479564, "block_available": 61317819, "block_used": 3161745, "inode_total": 16384000, "inode_available": 16302271, "inode_used": 81729, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, 
"ansible_eth0": {"device": "eth0", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d080:f60d:659:9515", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.210"], "ansible_all_ipv6_addresses": ["fe80::d080:f60d:659:9515"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.210", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d080:f60d:659:9515"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing 
encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing 
bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # 
destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing 
multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing 
ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy 
ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy 
ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy 
base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # 
cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy 
_frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
[WARNING]: Module invocation had junk after the JSON data:
encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # 
cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # 
destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed-node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information. 
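The [WARNING] above is Ansible's interpreter-discovery notice: the control node probed managed-node1 and resolved /usr/bin/python3.12, but that resolution could change if another interpreter is installed later. One way to make the choice explicit is to pin `ansible_python_interpreter` for the host; the fragment below is an illustrative sketch (the host name and path are taken from this log, the inventory layout is assumed):

```yaml
# Illustrative inventory fragment: pin the interpreter that discovery
# found in the log, so future Python installs cannot change the path's meaning.
all:
  hosts:
    managed-node1:
      ansible_python_interpreter: /usr/bin/python3.12
```

Alternatively, setting `interpreter_python = auto_silent` in `ansible.cfg` keeps automatic discovery but suppresses the warning.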
9396 1727204025.44225: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204025.44229: _low_level_execute_command(): starting 9396 1727204025.44232: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204024.0799267-9565-159407349289105/ > /dev/null 2>&1 && sleep 0' 9396 1727204025.45355: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204025.45405: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204025.45651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204025.45664: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204025.45728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204025.47747: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204025.47827: stderr chunk (state=3): >>><<< 9396 1727204025.47830: stdout chunk (state=3): >>><<< 9396 1727204025.47846: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204025.48043: handler run complete 9396 1727204025.48277: variable 'ansible_facts' from source: unknown 9396 1727204025.48517: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204025.49294: variable 'ansible_facts' from source: unknown 9396 1727204025.49765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204025.49866: attempt loop complete, returning result 9396 1727204025.49995: _execute() done 9396 1727204025.50003: dumping result to json 9396 1727204025.50034: done dumping result, returning 9396 1727204025.50207: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [12b410aa-8751-36c5-1f9e-0000000000cd] 9396 1727204025.50210: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000cd ok: [managed-node1] 9396 1727204025.51441: no more pending results, returning what we have 9396 1727204025.51445: results queue empty 9396 1727204025.51446: checking for any_errors_fatal 9396 1727204025.51447: done checking for any_errors_fatal 9396 1727204025.51448: checking for max_fail_percentage 9396 1727204025.51450: done checking for max_fail_percentage 9396 1727204025.51451: checking to see if all hosts have failed and the running result is not ok 9396 1727204025.51452: done checking to see if all hosts have failed 9396 1727204025.51453: getting the remaining hosts for this loop 9396 1727204025.51455: done getting the remaining hosts for this loop 9396 1727204025.51460: getting the next task for host managed-node1 9396 1727204025.51466: done getting next task for host managed-node1 9396 1727204025.51469: ^ task is: TASK: meta (flush_handlers) 9396 1727204025.51471: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204025.51475: getting variables 9396 1727204025.51477: in VariableManager get_vars() 9396 1727204025.51706: Calling all_inventory to load vars for managed-node1 9396 1727204025.51710: Calling groups_inventory to load vars for managed-node1 9396 1727204025.51714: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204025.51902: Calling all_plugins_play to load vars for managed-node1 9396 1727204025.51906: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204025.51911: Calling groups_plugins_play to load vars for managed-node1 9396 1727204025.52596: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000cd 9396 1727204025.52599: WORKER PROCESS EXITING 9396 1727204025.52628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204025.53103: done with get_vars() 9396 1727204025.53115: done getting variables 9396 1727204025.53402: in VariableManager get_vars() 9396 1727204025.53413: Calling all_inventory to load vars for managed-node1 9396 1727204025.53416: Calling groups_inventory to load vars for managed-node1 9396 1727204025.53419: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204025.53425: Calling all_plugins_play to load vars for managed-node1 9396 1727204025.53428: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204025.53432: Calling groups_plugins_play to load vars for managed-node1 9396 1727204025.53821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204025.54282: done with get_vars() 9396 1727204025.54300: done queuing things up, now waiting for results queue to drain 9396 1727204025.54302: results queue empty 9396 1727204025.54303: checking for any_errors_fatal 9396 1727204025.54306: done checking for any_errors_fatal 9396 1727204025.54307: checking for max_fail_percentage 9396 1727204025.54309: done checking for 
max_fail_percentage 9396 1727204025.54309: checking to see if all hosts have failed and the running result is not ok 9396 1727204025.54310: done checking to see if all hosts have failed 9396 1727204025.54316: getting the remaining hosts for this loop 9396 1727204025.54317: done getting the remaining hosts for this loop 9396 1727204025.54320: getting the next task for host managed-node1 9396 1727204025.54325: done getting next task for host managed-node1 9396 1727204025.54328: ^ task is: TASK: Include the task 'el_repo_setup.yml' 9396 1727204025.54330: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204025.54332: getting variables 9396 1727204025.54334: in VariableManager get_vars() 9396 1727204025.54344: Calling all_inventory to load vars for managed-node1 9396 1727204025.54346: Calling groups_inventory to load vars for managed-node1 9396 1727204025.54349: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204025.54355: Calling all_plugins_play to load vars for managed-node1 9396 1727204025.54358: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204025.54362: Calling groups_plugins_play to load vars for managed-node1 9396 1727204025.54874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204025.55548: done with get_vars() 9396 1727204025.55558: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:11 Tuesday 24 September 2024 14:53:45 -0400 (0:00:01.517) 0:00:01.527 ***** 9396 1727204025.55646: entering _queue_task() for 
managed-node1/include_tasks 9396 1727204025.55648: Creating lock for include_tasks 9396 1727204025.56381: worker is 1 (out of 1 available) 9396 1727204025.56399: exiting _queue_task() for managed-node1/include_tasks 9396 1727204025.56415: done queuing things up, now waiting for results queue to drain 9396 1727204025.56418: waiting for pending results... 9396 1727204025.56948: running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' 9396 1727204025.57034: in run() - task 12b410aa-8751-36c5-1f9e-000000000006 9396 1727204025.57054: variable 'ansible_search_path' from source: unknown 9396 1727204025.57100: calling self._execute() 9396 1727204025.57375: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204025.57392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204025.57413: variable 'omit' from source: magic vars 9396 1727204025.57895: _execute() done 9396 1727204025.57899: dumping result to json 9396 1727204025.57902: done dumping result, returning 9396 1727204025.57905: done running TaskExecutor() for managed-node1/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-36c5-1f9e-000000000006] 9396 1727204025.57910: sending task result for task 12b410aa-8751-36c5-1f9e-000000000006 9396 1727204025.57988: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000006 9396 1727204025.57994: WORKER PROCESS EXITING 9396 1727204025.58045: no more pending results, returning what we have 9396 1727204025.58051: in VariableManager get_vars() 9396 1727204025.58085: Calling all_inventory to load vars for managed-node1 9396 1727204025.58088: Calling groups_inventory to load vars for managed-node1 9396 1727204025.58296: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204025.58308: Calling all_plugins_play to load vars for managed-node1 9396 1727204025.58311: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204025.58315: Calling 
groups_plugins_play to load vars for managed-node1 9396 1727204025.58528: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204025.59218: done with get_vars() 9396 1727204025.59227: variable 'ansible_search_path' from source: unknown 9396 1727204025.59241: we have included files to process 9396 1727204025.59242: generating all_blocks data 9396 1727204025.59244: done generating all_blocks data 9396 1727204025.59245: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 9396 1727204025.59246: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 9396 1727204025.59249: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 9396 1727204025.60876: in VariableManager get_vars() 9396 1727204025.60897: done with get_vars() 9396 1727204025.60913: done processing included file 9396 1727204025.60915: iterating over new_blocks loaded from include file 9396 1727204025.60917: in VariableManager get_vars() 9396 1727204025.60929: done with get_vars() 9396 1727204025.60930: filtering new block on tags 9396 1727204025.60949: done filtering new block on tags 9396 1727204025.60952: in VariableManager get_vars() 9396 1727204025.60964: done with get_vars() 9396 1727204025.60966: filtering new block on tags 9396 1727204025.60984: done filtering new block on tags 9396 1727204025.60988: in VariableManager get_vars() 9396 1727204025.61206: done with get_vars() 9396 1727204025.61208: filtering new block on tags 9396 1727204025.61225: done filtering new block on tags 9396 1727204025.61228: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node1 9396 1727204025.61235: extending task 
lists for all hosts with included blocks 9396 1727204025.61504: done extending task lists 9396 1727204025.61505: done processing included files 9396 1727204025.61506: results queue empty 9396 1727204025.61507: checking for any_errors_fatal 9396 1727204025.61509: done checking for any_errors_fatal 9396 1727204025.61510: checking for max_fail_percentage 9396 1727204025.61511: done checking for max_fail_percentage 9396 1727204025.61512: checking to see if all hosts have failed and the running result is not ok 9396 1727204025.61513: done checking to see if all hosts have failed 9396 1727204025.61514: getting the remaining hosts for this loop 9396 1727204025.61515: done getting the remaining hosts for this loop 9396 1727204025.61518: getting the next task for host managed-node1 9396 1727204025.61523: done getting next task for host managed-node1 9396 1727204025.61525: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 9396 1727204025.61528: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204025.61531: getting variables 9396 1727204025.61532: in VariableManager get_vars() 9396 1727204025.61543: Calling all_inventory to load vars for managed-node1 9396 1727204025.61546: Calling groups_inventory to load vars for managed-node1 9396 1727204025.61549: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204025.61555: Calling all_plugins_play to load vars for managed-node1 9396 1727204025.61558: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204025.61562: Calling groups_plugins_play to load vars for managed-node1 9396 1727204025.61981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204025.62457: done with get_vars() 9396 1727204025.62468: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:53:45 -0400 (0:00:00.069) 0:00:01.596 ***** 9396 1727204025.62551: entering _queue_task() for managed-node1/setup 9396 1727204025.63293: worker is 1 (out of 1 available) 9396 1727204025.63307: exiting _queue_task() for managed-node1/setup 9396 1727204025.63322: done queuing things up, now waiting for results queue to drain 9396 1727204025.63324: waiting for pending results... 
9396 1727204025.63757: running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 9396 1727204025.63875: in run() - task 12b410aa-8751-36c5-1f9e-0000000000de 9396 1727204025.64197: variable 'ansible_search_path' from source: unknown 9396 1727204025.64201: variable 'ansible_search_path' from source: unknown 9396 1727204025.64204: calling self._execute() 9396 1727204025.64259: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204025.64412: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204025.64595: variable 'omit' from source: magic vars 9396 1727204025.65628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204025.70918: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204025.71179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204025.71596: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204025.71600: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204025.71602: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204025.71896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204025.71900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204025.71910: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204025.71965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204025.71992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204025.72423: variable 'ansible_facts' from source: unknown 9396 1727204025.72683: variable 'network_test_required_facts' from source: task vars 9396 1727204025.72740: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 9396 1727204025.72750: when evaluation is False, skipping this task 9396 1727204025.72758: _execute() done 9396 1727204025.72767: dumping result to json 9396 1727204025.72803: done dumping result, returning 9396 1727204025.72820: done running TaskExecutor() for managed-node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-36c5-1f9e-0000000000de] 9396 1727204025.72916: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000de skipping: [managed-node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 9396 1727204025.73206: no more pending results, returning what we have 9396 1727204025.73210: results queue empty 9396 1727204025.73213: checking for any_errors_fatal 9396 1727204025.73216: done checking for any_errors_fatal 9396 1727204025.73217: checking for max_fail_percentage 9396 1727204025.73218: done checking for max_fail_percentage 9396 
1727204025.73220: checking to see if all hosts have failed and the running result is not ok 9396 1727204025.73221: done checking to see if all hosts have failed 9396 1727204025.73222: getting the remaining hosts for this loop 9396 1727204025.73223: done getting the remaining hosts for this loop 9396 1727204025.73228: getting the next task for host managed-node1 9396 1727204025.73239: done getting next task for host managed-node1 9396 1727204025.73242: ^ task is: TASK: Check if system is ostree 9396 1727204025.73246: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204025.73249: getting variables 9396 1727204025.73252: in VariableManager get_vars() 9396 1727204025.73283: Calling all_inventory to load vars for managed-node1 9396 1727204025.73287: Calling groups_inventory to load vars for managed-node1 9396 1727204025.73596: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204025.73609: Calling all_plugins_play to load vars for managed-node1 9396 1727204025.73613: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204025.73617: Calling groups_plugins_play to load vars for managed-node1 9396 1727204025.74171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204025.74714: done with get_vars() 9396 1727204025.74727: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:53:45 -0400 (0:00:00.124) 0:00:01.721 ***** 9396 1727204025.75046: entering _queue_task() for managed-node1/stat 9396 1727204025.75141: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000de 9396 1727204025.75145: WORKER PROCESS EXITING 9396 1727204025.75630: worker is 1 (out of 1 available) 9396 1727204025.75641: exiting _queue_task() for managed-node1/stat 9396 1727204025.75654: done queuing things up, now waiting for results queue to drain 9396 1727204025.75656: waiting for pending results... 
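The `Evaluated conditional (...): False` entry above shows the skip logic for the fact-gathering task: it runs only when the facts already cached do not cover the required subset. The sketch below is an illustrative approximation (not Ansible's implementation) of that Jinja2 expression:

```python
# Illustrative sketch of the skip condition seen in the log:
#   not ansible_facts.keys() | list | intersect(network_test_required_facts)
#       == network_test_required_facts

def should_gather(ansible_facts, required):
    """Approximate the task's `when:` expression with plain Python."""
    # Jinja2's `intersect` filter yields the items common to both sequences;
    # iterating over `required` approximates it while keeping a stable order.
    collected = [name for name in required if name in ansible_facts]
    # Comparisons bind tighter than `not` in Jinja2, so the expression reads:
    # gather unless the intersection already equals the required list.
    return collected != required
```

In the log, every required fact was already present, so the conditional evaluated to False and the task was skipped with `skip_reason: Conditional result was False`.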
9396 1727204025.76062: running TaskExecutor() for managed-node1/TASK: Check if system is ostree 9396 1727204025.76232: in run() - task 12b410aa-8751-36c5-1f9e-0000000000e0 9396 1727204025.76416: variable 'ansible_search_path' from source: unknown 9396 1727204025.76465: variable 'ansible_search_path' from source: unknown 9396 1727204025.76514: calling self._execute() 9396 1727204025.76603: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204025.76711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204025.76731: variable 'omit' from source: magic vars 9396 1727204025.78198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204025.78656: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204025.78719: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204025.78924: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204025.78971: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204025.79082: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204025.79125: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204025.79169: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204025.79213: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204025.79440: Evaluated conditional (not __network_is_ostree is defined): True 9396 1727204025.79520: variable 'omit' from source: magic vars 9396 1727204025.79572: variable 'omit' from source: magic vars 9396 1727204025.79634: variable 'omit' from source: magic vars 9396 1727204025.79669: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204025.79742: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204025.79769: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204025.79801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204025.79824: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204025.79872: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204025.79884: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204025.79896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204025.80032: Set connection var ansible_timeout to 10 9396 1727204025.80050: Set connection var ansible_shell_executable to /bin/sh 9396 1727204025.80072: Set connection var ansible_pipelining to False 9396 1727204025.80085: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204025.80101: Set connection var ansible_connection to ssh 9396 1727204025.80112: Set connection var ansible_shell_type to sh 9396 1727204025.80147: variable 'ansible_shell_executable' from source: unknown 9396 1727204025.80156: variable 'ansible_connection' from source: unknown 9396 1727204025.80171: variable 'ansible_module_compression' 
from source: unknown 9396 1727204025.80179: variable 'ansible_shell_type' from source: unknown 9396 1727204025.80187: variable 'ansible_shell_executable' from source: unknown 9396 1727204025.80197: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204025.80207: variable 'ansible_pipelining' from source: unknown 9396 1727204025.80278: variable 'ansible_timeout' from source: unknown 9396 1727204025.80280: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204025.80575: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204025.80596: variable 'omit' from source: magic vars 9396 1727204025.80695: starting attempt loop 9396 1727204025.80699: running the handler 9396 1727204025.80702: _low_level_execute_command(): starting 9396 1727204025.80705: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204025.81559: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204025.81636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204025.81663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204025.81692: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204025.81855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204025.83680: stdout chunk (state=3): >>>/root <<< 9396 1727204025.83835: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204025.84046: stdout chunk (state=3): >>><<< 9396 1727204025.84050: stderr chunk (state=3): >>><<< 9396 1727204025.84053: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204025.84066: 
_low_level_execute_command(): starting 9396 1727204025.84069: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942 `" && echo ansible-tmp-1727204025.8398194-9969-112828906055942="` echo /root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942 `" ) && sleep 0' 9396 1727204025.85508: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204025.85543: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204025.85560: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204025.85576: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204025.85674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204025.87776: stdout chunk (state=3): >>>ansible-tmp-1727204025.8398194-9969-112828906055942=/root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942 <<< 9396 1727204025.87895: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204025.88069: stderr chunk (state=3): >>><<< 9396 1727204025.88092: stdout chunk (state=3): >>><<< 9396 1727204025.88114: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204025.8398194-9969-112828906055942=/root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204025.88292: variable 'ansible_module_compression' from source: unknown 9396 1727204025.88360: ANSIBALLZ: Using lock for stat 9396 1727204025.88418: ANSIBALLZ: Acquiring lock 9396 1727204025.88428: ANSIBALLZ: Lock acquired: 139797142491296 9396 1727204025.88439: ANSIBALLZ: Creating module 9396 1727204026.19998: ANSIBALLZ: Writing module into payload 9396 1727204026.20003: ANSIBALLZ: Writing module 9396 1727204026.20005: ANSIBALLZ: Renaming module 9396 1727204026.20010: 
ANSIBALLZ: Done creating module 9396 1727204026.20013: variable 'ansible_facts' from source: unknown 9396 1727204026.20016: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/AnsiballZ_stat.py 9396 1727204026.20306: Sending initial data 9396 1727204026.20313: Sent initial data (151 bytes) 9396 1727204026.20842: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204026.20853: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204026.20905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204026.20975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204026.20993: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204026.21014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204026.21172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204026.22957: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204026.23000: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204026.23081: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpgc5mreba /root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/AnsiballZ_stat.py <<< 9396 1727204026.23085: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/AnsiballZ_stat.py" <<< 9396 1727204026.23124: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpgc5mreba" to remote "/root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/AnsiballZ_stat.py" <<< 9396 1727204026.25204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204026.25494: stderr chunk (state=3): >>><<< 9396 1727204026.25499: stdout chunk (state=3): >>><<< 9396 1727204026.25501: done transferring module to remote 9396 1727204026.25504: _low_level_execute_command(): starting 9396 1727204026.25507: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x 
/root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/ /root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/AnsiballZ_stat.py && sleep 0' 9396 1727204026.27004: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204026.27012: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204026.27015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204026.27018: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204026.27025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204026.27029: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204026.27306: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204026.29299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204026.29450: stderr chunk (state=3): >>><<< 9396 1727204026.29454: stdout chunk (state=3): >>><<< 9396 1727204026.29475: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204026.29479: _low_level_execute_command(): starting 9396 1727204026.29486: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/AnsiballZ_stat.py && sleep 0' 9396 1727204026.30920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204026.30925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204026.30928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204026.30930: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204026.30933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204026.31043: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204026.31051: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204026.31111: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204026.31192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204026.33485: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 9396 1727204026.33514: stdout chunk (state=3): >>>import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 9396 1727204026.33725: stdout chunk (state=3): >>>import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 9396 1727204026.33731: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.33754: stdout chunk (state=3): >>>import '_codecs' # <<< 9396 1727204026.33789: stdout chunk (state=3): >>>import 'codecs' # <<< 9396 1727204026.33851: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 9396 
1727204026.33867: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a60c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a5dbad0> <<< 9396 1727204026.33893: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 9396 1727204026.33928: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a60ea20> <<< 9396 1727204026.33938: stdout chunk (state=3): >>>import '_signal' # <<< 9396 1727204026.33953: stdout chunk (state=3): >>>import '_abc' # <<< 9396 1727204026.33960: stdout chunk (state=3): >>>import 'abc' # <<< 9396 1727204026.33968: stdout chunk (state=3): >>>import 'io' # <<< 9396 1727204026.34012: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 9396 1727204026.34117: stdout chunk (state=3): >>>import '_collections_abc' # <<< 9396 1727204026.34150: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 9396 1727204026.34188: stdout chunk (state=3): >>>import 'os' # <<< 9396 1727204026.34202: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 9396 1727204026.34236: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 9396 1727204026.34268: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 9396 
1727204026.34295: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 9396 1727204026.34310: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4010a0> <<< 9396 1727204026.34372: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 9396 1727204026.34386: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a401fd0> <<< 9396 1727204026.34416: stdout chunk (state=3): >>>import 'site' # <<< 9396 1727204026.34448: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 9396 1727204026.34714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 9396 1727204026.34720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 9396 1727204026.34759: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.34763: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 9396 1727204026.34819: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 9396 1727204026.34834: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 9396 1727204026.34862: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43fec0> <<< 9396 1727204026.34900: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 9396 1727204026.34929: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 9396 1727204026.34955: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43ff80> <<< 9396 1727204026.34973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 9396 1727204026.35002: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 9396 1727204026.35022: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 9396 1727204026.35073: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.35117: stdout chunk (state=3): >>>import 'itertools' # <<< 9396 1727204026.35124: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4778c0> <<< 9396 1727204026.35168: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a477f50> <<< 9396 1727204026.35171: stdout chunk (state=3): >>>import '_collections' # <<< 9396 1727204026.35228: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a457b60> import '_functools' # <<< 9396 1727204026.35270: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4552b0> <<< 9396 1727204026.35369: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43d070> <<< 9396 1727204026.35418: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 9396 1727204026.35422: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 9396 1727204026.35448: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_parser.py <<< 9396 1727204026.35499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 9396 1727204026.35514: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 9396 1727204026.35541: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a49b890> <<< 9396 1727204026.35581: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a49a4b0> <<< 9396 1727204026.35585: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 9396 1727204026.35616: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4562a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a498bc0> <<< 9396 1727204026.35661: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 9396 1727204026.35687: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4cc800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43c2f0> <<< 9396 1727204026.35727: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 9396 1727204026.35784: stdout chunk (state=3): >>># extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4cccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ccb60> <<< 9396 1727204026.35821: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4ccf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43ae10> <<< 9396 1727204026.35845: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.35879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 9396 1727204026.35885: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 9396 1727204026.35940: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4cd610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4cd2e0> import 'importlib.machinery' # <<< 9396 1727204026.35944: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 9396 1727204026.35976: stdout chunk (state=3): >>>import 
'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ce510> import 'importlib.util' # import 'runpy' # <<< 9396 1727204026.36143: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 9396 1727204026.36149: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4e8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4e9e80> <<< 9396 1727204026.36174: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 9396 1727204026.36205: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 9396 1727204026.36221: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ead80> <<< 9396 1727204026.36292: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fcd5a4eb3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ea2d0> <<< 9396 1727204026.36328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 9396 1727204026.36397: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4ebe30> <<< 9396 1727204026.36420: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4eb560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ce570> <<< 9396 1727204026.36436: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 9396 1727204026.36734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 9396 1727204026.36738: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 9396 1727204026.36742: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a2b7d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # 
extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a2e07d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2e0530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a2e0800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a2e09e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2b5ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 9396 1727204026.36842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 9396 1727204026.36863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 9396 1727204026.36883: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2e2000> <<< 9396 1727204026.36913: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2e0c80> <<< 9396 1727204026.36928: stdout chunk (state=3): >>>import 'tempfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4cec60> <<< 9396 1727204026.36960: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 9396 1727204026.37010: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.37030: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 9396 1727204026.37070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 9396 1727204026.37121: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a30e390> <<< 9396 1727204026.37167: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 9396 1727204026.37191: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.37226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 9396 1727204026.37229: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 9396 1727204026.37284: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a326540> <<< 9396 1727204026.37301: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 9396 1727204026.37333: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 9396 1727204026.37408: stdout chunk (state=3): >>>import 'ntpath' # <<< 9396 1727204026.37431: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a35f2f0> <<< 9396 1727204026.37450: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 9396 1727204026.37501: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 9396 1727204026.37519: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 9396 1727204026.37556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 9396 1727204026.37651: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a385a90> <<< 9396 1727204026.37733: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a35f410> <<< 9396 1727204026.37777: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a3271d0> <<< 9396 1727204026.37825: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a15c440> <<< 9396 1727204026.37841: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a325580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2e2f30> <<< 9396 1727204026.37956: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 9396 1727204026.37968: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcd5a15c6e0> <<< 9396 1727204026.38043: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_hwum8u21/ansible_stat_payload.zip' # zipimport: zlib available <<< 9396 1727204026.38211: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.38238: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 9396 1727204026.38249: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 9396 1727204026.38296: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 9396 1727204026.38374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 9396 1727204026.38418: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1b6120> import '_typing' # <<< 9396 1727204026.38635: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a18d0a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a18c200> <<< 9396 1727204026.38652: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.38679: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available <<< 9396 1727204026.38722: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 9396 1727204026.38725: stdout 
chunk (state=3): >>>import 'ansible.module_utils' # <<< 9396 1727204026.38755: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.40509: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.41625: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 9396 1727204026.41629: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a18f1a0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 9396 1727204026.41656: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.41675: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 9396 1727204026.41700: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 9396 1727204026.41731: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204026.41750: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a1e1b50> <<< 9396 1727204026.41767: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1e18e0> <<< 9396 1727204026.41806: stdout chunk (state=3): >>>import 
'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1e11f0> <<< 9396 1727204026.41829: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 9396 1727204026.41862: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1e1c40> <<< 9396 1727204026.41895: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1b6bd0> import 'atexit' # <<< 9396 1727204026.41918: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a1e2900> <<< 9396 1727204026.41952: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204026.41963: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a1e2b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 9396 1727204026.42021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 9396 1727204026.42035: stdout chunk (state=3): >>>import '_locale' # <<< 9396 1727204026.42237: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1e3020> import 'pwd' # <<< 9396 1727204026.42241: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a044dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0469f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 9396 1727204026.42244: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 9396 1727204026.42283: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0473b0> <<< 9396 1727204026.42303: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 9396 1727204026.42325: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 9396 1727204026.42348: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a048590> <<< 9396 1727204026.42368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 9396 1727204026.42415: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 9396 1727204026.42429: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 9396 1727204026.42514: stdout chunk (state=3): >>>import 
'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04b080> <<< 9396 1727204026.42539: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a04b1d0> <<< 9396 1727204026.42728: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a049340> <<< 9396 1727204026.42731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04ef60> import '_tokenize' # <<< 9396 1727204026.42811: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04da60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04d7c0> <<< 9396 1727204026.42818: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 9396 1727204026.42835: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 9396 1727204026.42911: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04fe60> <<< 9396 1727204026.42937: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a049850> <<< 9396 1727204026.42964: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0970b0><<< 9396 1727204026.42985: stdout chunk (state=3): >>> <<< 9396 1727204026.42999: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 9396 1727204026.43061: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0972c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 9396 1727204026.43085: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 9396 1727204026.43405: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from 
'/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a09cd70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a09cb30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a09f290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a09d400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 9396 1727204026.43418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.43421: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 9396 1727204026.43424: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 9396 1727204026.43470: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0a2ab0> <<< 9396 1727204026.43627: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a09f440> <<< 9396 1727204026.43781: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from 
'/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a38c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a38f0> <<< 9396 1727204026.43803: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a3ce0> <<< 9396 1727204026.43845: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0974a0> <<< 9396 1727204026.43850: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 9396 1727204026.43874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 9396 1727204026.43899: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 9396 1727204026.43969: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from 
'/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a7380> <<< 9396 1727204026.44154: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204026.44239: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a8710> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0a5af0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204026.44296: stdout chunk (state=3): >>># extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a6ea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0a5700> # zipimport: zlib available <<< 9396 1727204026.44303: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 9396 1727204026.44327: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.44400: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.44694: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # <<< 9396 1727204026.44701: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.44705: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 9396 1727204026.44707: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.44727: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.44861: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.45606: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.46312: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 9396 1727204026.46339: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 9396 1727204026.46382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.46433: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a1308f0> <<< 9396 1727204026.46547: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 9396 1727204026.46595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1315e0> <<< 9396 1727204026.46608: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0ab170> <<< 9396 1727204026.46650: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 9396 1727204026.46675: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.46716: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.46729: stdout 
chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 9396 1727204026.46922: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.47128: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 9396 1727204026.47137: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a131640> <<< 9396 1727204026.47166: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.47746: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.48398: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.48437: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.48526: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 9396 1727204026.48529: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.48626: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 9396 1727204026.48703: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.48969: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 9396 1727204026.48972: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 9396 1727204026.48975: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.49262: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.49568: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 9396 1727204026.49636: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 9396 1727204026.49651: stdout chunk (state=3): >>>import '_ast' # <<< 9396 1727204026.49754: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a132510> <<< 9396 1727204026.49768: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.49838: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.49947: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 9396 1727204026.49985: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 9396 1727204026.50078: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 9396 1727204026.50204: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd59f3e150> <<< 9396 1727204026.50273: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd59f3eae0> <<< 9396 1727204026.50297: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a133050> # zipimport: zlib available <<< 9396 1727204026.50352: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 9396 1727204026.50399: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 9396 1727204026.50418: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.50451: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.50506: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.50565: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.50644: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 9396 1727204026.50703: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.50810: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd59f3d850> <<< 9396 1727204026.50846: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59f3ecf0> <<< 9396 1727204026.50899: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 9396 1727204026.50905: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.50966: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.51052: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.51065: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.51118: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 9396 1727204026.51141: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 9396 1727204026.51186: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 9396 1727204026.51192: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 9396 1727204026.51258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 9396 1727204026.51291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 9396 1727204026.51306: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 9396 1727204026.51353: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59fceea0> <<< 9396 1727204026.51416: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59f4bd10> <<< 9396 1727204026.51511: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59f46cf0> <<< 9396 1727204026.51542: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59f46b40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 9396 1727204026.51548: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 9396 1727204026.51569: stdout chunk (state=3): >>>import 
'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 9396 1727204026.51654: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 9396 1727204026.51676: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 9396 1727204026.51693: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.51833: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.52054: stdout chunk (state=3): >>># zipimport: zlib available <<< 9396 1727204026.52272: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 9396 1727204026.52656: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing 
_stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc <<< 9396 1727204026.52692: stdout chunk (state=3): >>># cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy 
_weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing <<< 9396 1727204026.52719: stdout chunk (state=3): >>># cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime<<< 9396 1727204026.52767: stdout chunk (state=3): >>> # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing 
systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy 
ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 9396 1727204026.53134: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib <<< 9396 1727204026.53140: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal <<< 9396 1727204026.53217: stdout chunk (state=3): >>># destroy 
systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno<<< 9396 1727204026.53327: stdout chunk (state=3): >>> # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux <<< 9396 1727204026.53333: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing <<< 9396 1727204026.53368: stdout chunk (state=3): >>># cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy 
re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 9396 1727204026.53445: stdout chunk (state=3): >>># cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs <<< 9396 1727204026.53520: stdout chunk (state=3): >>># cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 9396 1727204026.53595: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 9396 1727204026.53668: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 9396 1727204026.53704: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 9396 1727204026.53771: stdout chunk (state=3): >>># destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 9396 1727204026.53884: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 9396 1727204026.53957: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools <<< 9396 1727204026.53974: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 9396 1727204026.54415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
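The stream above is the wrapped module finishing: its only real output is the JSON result emitted earlier, `{"changed": false, "stat": {"exists": false}}` for `module_args` probing `/run/ostree-booted` (the marker file Ansible's fact gathering uses to detect rpm-ostree hosts). Everything else is CPython import/teardown noise from the AnsiballZ payload. As a minimal sketch of what that result means — not the real `ansible.builtin.stat` implementation, which also handles checksums, MIME types, and extended attributes — the existence check reduces to:

```python
import json
import os


def stat_sketch(path: str, follow: bool = False) -> dict:
    """Hypothetical, simplified stand-in for ansible.builtin.stat.

    Only reports existence; the real module returns many more fields
    (checksum, mode, mime type, attributes) when the path exists.
    """
    # follow=False (the default in the log's module_args) means a dangling
    # symlink still counts as existing, hence lexists rather than exists.
    exists = os.path.exists(path) if follow else os.path.lexists(path)
    return {"changed": False, "stat": {"exists": exists}}


# On a non-ostree host /run/ostree-booted is absent, matching the log's
# {"changed": false, "stat": {"exists": false}} result.
print(json.dumps(stat_sketch("/run/ostree-booted")))
```

The surrounding `cleanup[2]`/`cleanup[3] wiping` and `destroy` lines are CPython's normal `-X showrefcount`-style interpreter shutdown trace, printed because the payload runs with verbose import logging; they carry no task-level information.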
<<< 9396 1727204026.54794: stderr chunk (state=3): >>><<< 9396 1727204026.54797: stdout chunk (state=3): >>><<< 9396 1727204026.54813: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a60c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a5dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a60ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4010a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a401fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43ff80> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4778c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a477f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a457b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4552b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a49b890> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a49a4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4562a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a498bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4cc800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4cccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ccb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4ccf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a43ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4cd610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4cd2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ce510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4e8740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4e9e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ead80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4eb3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ea2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a4ebe30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4eb560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4ce570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a2b7d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a2e07d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2e0530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a2e0800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a2e09e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2b5ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2e2000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2e0c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a4cec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a30e390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a326540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a35f2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a385a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a35f410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a3271d0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a15c440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a325580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a2e2f30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fcd5a15c6e0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_hwum8u21/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1b6120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a18d0a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a18c200> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a18f1a0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a1e1b50> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1e18e0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1e11f0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1e1c40> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1b6bd0> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a1e2900> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a1e2b40> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1e3020> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a044dd0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0469f0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0473b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a048590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04b080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a04b1d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a049340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04ef60> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04da60> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04d7c0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a04fe60> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a049850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0970b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0972c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a09cd70> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a09cb30> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a09f290> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a09d400> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0a2ab0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a09f440> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a38c0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a38f0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a3ce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0974a0> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a7380> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a8710> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0a5af0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a0a6ea0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0a5700> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd5a1308f0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a1315e0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a0ab170> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a131640> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a132510> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd59f3e150> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd59f3eae0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd5a133050> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fcd59f3d850> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59f3ecf0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59fceea0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59f4bd10> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fcd59f46cf0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fcd59f46b40> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] 
removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes 
# destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # 
destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks
9396 1727204026.56405: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
9396 1727204026.56412: _low_level_execute_command(): starting
9396 1727204026.56415: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204025.8398194-9969-112828906055942/ > /dev/null 2>&1 && sleep 0'
9396 1727204026.56456: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
9396 1727204026.56460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204026.56462: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
9396 1727204026.56464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
9396 1727204026.56731: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
9396 1727204026.56735: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
9396 1727204026.56769: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
9396 1727204026.56845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
9396 1727204026.58803: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
9396 1727204026.58869: stderr chunk (state=3): >>><<<
9396 1727204026.58873: stdout chunk (state=3): >>><<<
9396 1727204026.58892: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
9396 1727204026.58900: handler run complete
9396 1727204026.58930: attempt loop complete, returning result
9396 1727204026.58933: _execute() done
9396 1727204026.58936: dumping result to json
9396 1727204026.58941: done dumping result, returning
9396 1727204026.58951: done running TaskExecutor() for managed-node1/TASK: Check if system is ostree [12b410aa-8751-36c5-1f9e-0000000000e0]
9396 1727204026.58958: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000e0
9396 1727204026.59064: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000e0
9396 1727204026.59067: WORKER PROCESS
EXITING
ok: [managed-node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
9396 1727204026.59172: no more pending results, returning what we have
9396 1727204026.59175: results queue empty
9396 1727204026.59176: checking for any_errors_fatal
9396 1727204026.59183: done checking for any_errors_fatal
9396 1727204026.59184: checking for max_fail_percentage
9396 1727204026.59186: done checking for max_fail_percentage
9396 1727204026.59187: checking to see if all hosts have failed and the running result is not ok
9396 1727204026.59188: done checking to see if all hosts have failed
9396 1727204026.59191: getting the remaining hosts for this loop
9396 1727204026.59192: done getting the remaining hosts for this loop
9396 1727204026.59197: getting the next task for host managed-node1
9396 1727204026.59204: done getting next task for host managed-node1
9396 1727204026.59209: ^ task is: TASK: Set flag to indicate system is ostree
9396 1727204026.59212: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204026.59216: getting variables
9396 1727204026.59218: in VariableManager get_vars()
9396 1727204026.59247: Calling all_inventory to load vars for managed-node1
9396 1727204026.59250: Calling groups_inventory to load vars for managed-node1
9396 1727204026.59254: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204026.59265: Calling all_plugins_play to load vars for managed-node1
9396 1727204026.59268: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204026.59272: Calling groups_plugins_play to load vars for managed-node1
9396 1727204026.59607: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204026.59872: done with get_vars()
9396 1727204026.59886: done getting variables
9396 1727204026.60215: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
Tuesday 24 September 2024  14:53:46 -0400 (0:00:00.852)       0:00:02.573 *****
9396 1727204026.60249: entering _queue_task() for managed-node1/set_fact
9396 1727204026.60251: Creating lock for set_fact
9396 1727204026.60751: worker is 1 (out of 1 available)
9396 1727204026.60765: exiting _queue_task() for managed-node1/set_fact
9396 1727204026.60778: done queuing things up, now waiting for results queue to drain
9396 1727204026.60780: waiting for pending results...
9396 1727204026.61292: running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree
9396 1727204026.61338: in run() - task 12b410aa-8751-36c5-1f9e-0000000000e1
9396 1727204026.61360: variable 'ansible_search_path' from source: unknown
9396 1727204026.61368: variable 'ansible_search_path' from source: unknown
9396 1727204026.61421: calling self._execute()
9396 1727204026.61519: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204026.61544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204026.61563: variable 'omit' from source: magic vars
9396 1727204026.62258: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9396 1727204026.62567: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9396 1727204026.62634: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9396 1727204026.62682: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9396 1727204026.62747: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9396 1727204026.62856: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
9396 1727204026.62893: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
9396 1727204026.62965: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204026.62979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
9396 1727204026.63135: Evaluated conditional (not __network_is_ostree is defined): True
9396 1727204026.63151: variable 'omit' from source: magic vars
9396 1727204026.63293: variable 'omit' from source: magic vars
9396 1727204026.63377: variable '__ostree_booted_stat' from source: set_fact
9396 1727204026.63449: variable 'omit' from source: magic vars
9396 1727204026.63482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
9396 1727204026.63533: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
9396 1727204026.63561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
9396 1727204026.63587: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
9396 1727204026.63617: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
9396 1727204026.63656: variable 'inventory_hostname' from source: host vars for 'managed-node1'
9396 1727204026.63666: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204026.63675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204026.63815: Set connection var ansible_timeout to 10
9396 1727204026.63894: Set connection var ansible_shell_executable to /bin/sh
9396 1727204026.63897: Set connection var ansible_pipelining to False
9396 1727204026.63899: Set connection var ansible_module_compression to ZIP_DEFLATED
9396 1727204026.63902: Set connection var ansible_connection to ssh
9396 1727204026.63904: Set connection var ansible_shell_type to sh
9396 1727204026.63912: variable 'ansible_shell_executable' from source: unknown
9396 1727204026.63921: variable 'ansible_connection' from source: unknown
9396 1727204026.63928: variable 'ansible_module_compression' from source: unknown
9396 1727204026.63935: variable 'ansible_shell_type' from source: unknown
9396 1727204026.63951: variable 'ansible_shell_executable' from source: unknown
9396 1727204026.63959: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204026.63968: variable 'ansible_pipelining' from source: unknown
9396 1727204026.63975: variable 'ansible_timeout' from source: unknown
9396 1727204026.63984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204026.64122: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
9396 1727204026.64141: variable 'omit' from source: magic vars
9396 1727204026.64165: starting attempt loop
9396 1727204026.64169: running the handler
9396 1727204026.64276: handler run complete
9396 1727204026.64279: attempt loop complete, returning result
9396 1727204026.64282: _execute() done
9396 1727204026.64285: dumping result to json
9396 1727204026.64287: done dumping result, returning
9396 1727204026.64290: done running TaskExecutor() for managed-node1/TASK: Set flag to indicate system is ostree [12b410aa-8751-36c5-1f9e-0000000000e1]
9396 1727204026.64294: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000e1
9396 1727204026.64373: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000e1
9396 1727204026.64491: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
9396 1727204026.64552: no more pending results, returning what we have
9396 1727204026.64555: results queue empty
9396 1727204026.64557: checking for
any_errors_fatal
9396 1727204026.64562: done checking for any_errors_fatal
9396 1727204026.64563: checking for max_fail_percentage
9396 1727204026.64565: done checking for max_fail_percentage
9396 1727204026.64565: checking to see if all hosts have failed and the running result is not ok
9396 1727204026.64566: done checking to see if all hosts have failed
9396 1727204026.64567: getting the remaining hosts for this loop
9396 1727204026.64569: done getting the remaining hosts for this loop
9396 1727204026.64573: getting the next task for host managed-node1
9396 1727204026.64581: done getting next task for host managed-node1
9396 1727204026.64584: ^ task is: TASK: Fix CentOS6 Base repo
9396 1727204026.64587: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204026.64594: getting variables
9396 1727204026.64595: in VariableManager get_vars()
9396 1727204026.64624: Calling all_inventory to load vars for managed-node1
9396 1727204026.64627: Calling groups_inventory to load vars for managed-node1
9396 1727204026.64631: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204026.64642: Calling all_plugins_play to load vars for managed-node1
9396 1727204026.64645: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204026.64654: Calling groups_plugins_play to load vars for managed-node1
9396 1727204026.64943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204026.65352: done with get_vars()
9396 1727204026.65364: done getting variables
9396 1727204026.65707: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Tuesday 24 September 2024  14:53:46 -0400 (0:00:00.054)       0:00:02.628 *****
9396 1727204026.65740: entering _queue_task() for managed-node1/copy
9396 1727204026.66320: worker is 1 (out of 1 available)
9396 1727204026.66335: exiting _queue_task() for managed-node1/copy
9396 1727204026.66350: done queuing things up, now waiting for results queue to drain
9396 1727204026.66352: waiting for pending results...
9396 1727204026.66714: running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo
9396 1727204026.66735: in run() - task 12b410aa-8751-36c5-1f9e-0000000000e3
9396 1727204026.66751: variable 'ansible_search_path' from source: unknown
9396 1727204026.66754: variable 'ansible_search_path' from source: unknown
9396 1727204026.66836: calling self._execute()
9396 1727204026.66945: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204026.66949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204026.66953: variable 'omit' from source: magic vars
9396 1727204026.67463: variable 'ansible_distribution' from source: facts
9396 1727204026.67514: Evaluated conditional (ansible_distribution == 'CentOS'): False
9396 1727204026.67518: when evaluation is False, skipping this task
9396 1727204026.67521: _execute() done
9396 1727204026.67526: dumping result to json
9396 1727204026.67531: done dumping result, returning
9396 1727204026.67657: done running TaskExecutor() for managed-node1/TASK: Fix CentOS6 Base repo [12b410aa-8751-36c5-1f9e-0000000000e3]
9396 1727204026.67661: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000e3
9396 1727204026.67752: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000e3
9396 1727204026.67756: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution == 'CentOS'",
    "skip_reason": "Conditional result was False"
}
9396 1727204026.67810: no more pending results, returning what we have
9396 1727204026.67814: results queue empty
9396 1727204026.67815: checking for any_errors_fatal
9396 1727204026.67882: done checking for any_errors_fatal
9396 1727204026.67883: checking for max_fail_percentage
9396 1727204026.67885: done checking for max_fail_percentage
9396 1727204026.67887: checking to see if all hosts have failed and the running result is not ok
9396 1727204026.67888: done checking to see if all hosts
have failed 9396 1727204026.67890: getting the remaining hosts for this loop 9396 1727204026.67892: done getting the remaining hosts for this loop 9396 1727204026.67896: getting the next task for host managed-node1 9396 1727204026.67902: done getting next task for host managed-node1 9396 1727204026.67906: ^ task is: TASK: Include the task 'enable_epel.yml' 9396 1727204026.67909: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204026.67913: getting variables 9396 1727204026.67915: in VariableManager get_vars() 9396 1727204026.67944: Calling all_inventory to load vars for managed-node1 9396 1727204026.67948: Calling groups_inventory to load vars for managed-node1 9396 1727204026.67952: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.67964: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.67968: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.67972: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.68293: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.68566: done with get_vars() 9396 1727204026.68578: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:53:46 
-0400 (0:00:00.029) 0:00:02.657 ***** 9396 1727204026.68680: entering _queue_task() for managed-node1/include_tasks 9396 1727204026.68940: worker is 1 (out of 1 available) 9396 1727204026.68951: exiting _queue_task() for managed-node1/include_tasks 9396 1727204026.68963: done queuing things up, now waiting for results queue to drain 9396 1727204026.68965: waiting for pending results... 9396 1727204026.69310: running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' 9396 1727204026.69317: in run() - task 12b410aa-8751-36c5-1f9e-0000000000e4 9396 1727204026.69335: variable 'ansible_search_path' from source: unknown 9396 1727204026.69344: variable 'ansible_search_path' from source: unknown 9396 1727204026.69376: calling self._execute() 9396 1727204026.69464: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.69469: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.69571: variable 'omit' from source: magic vars 9396 1727204026.70922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204026.74385: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204026.74388: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204026.74435: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204026.74481: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204026.74522: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204026.74621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 9396 1727204026.74662: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204026.74700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204026.74760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204026.74782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204026.74923: variable '__network_is_ostree' from source: set_fact 9396 1727204026.74950: Evaluated conditional (not __network_is_ostree | d(false)): True 9396 1727204026.74962: _execute() done 9396 1727204026.74970: dumping result to json 9396 1727204026.74978: done dumping result, returning 9396 1727204026.74991: done running TaskExecutor() for managed-node1/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-36c5-1f9e-0000000000e4] 9396 1727204026.75003: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000e4 9396 1727204026.75173: no more pending results, returning what we have 9396 1727204026.75180: in VariableManager get_vars() 9396 1727204026.75219: Calling all_inventory to load vars for managed-node1 9396 1727204026.75223: Calling groups_inventory to load vars for managed-node1 9396 1727204026.75228: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.75241: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.75246: Calling groups_plugins_inventory to load vars for 
managed-node1 9396 1727204026.75250: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.76118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.76707: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000e4 9396 1727204026.76710: WORKER PROCESS EXITING 9396 1727204026.76717: done with get_vars() 9396 1727204026.76726: variable 'ansible_search_path' from source: unknown 9396 1727204026.76728: variable 'ansible_search_path' from source: unknown 9396 1727204026.76773: we have included files to process 9396 1727204026.76774: generating all_blocks data 9396 1727204026.76776: done generating all_blocks data 9396 1727204026.76782: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 9396 1727204026.76784: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 9396 1727204026.76787: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 9396 1727204026.77801: done processing included file 9396 1727204026.77804: iterating over new_blocks loaded from include file 9396 1727204026.77806: in VariableManager get_vars() 9396 1727204026.77821: done with get_vars() 9396 1727204026.77823: filtering new block on tags 9396 1727204026.77851: done filtering new block on tags 9396 1727204026.77855: in VariableManager get_vars() 9396 1727204026.77888: done with get_vars() 9396 1727204026.77892: filtering new block on tags 9396 1727204026.77908: done filtering new block on tags 9396 1727204026.77911: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node1 9396 1727204026.77917: extending task lists for all hosts with included blocks 9396 
1727204026.78051: done extending task lists 9396 1727204026.78053: done processing included files 9396 1727204026.78054: results queue empty 9396 1727204026.78055: checking for any_errors_fatal 9396 1727204026.78059: done checking for any_errors_fatal 9396 1727204026.78060: checking for max_fail_percentage 9396 1727204026.78061: done checking for max_fail_percentage 9396 1727204026.78062: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.78063: done checking to see if all hosts have failed 9396 1727204026.78064: getting the remaining hosts for this loop 9396 1727204026.78066: done getting the remaining hosts for this loop 9396 1727204026.78068: getting the next task for host managed-node1 9396 1727204026.78073: done getting next task for host managed-node1 9396 1727204026.78076: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 9396 1727204026.78079: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.78082: getting variables 9396 1727204026.78083: in VariableManager get_vars() 9396 1727204026.78095: Calling all_inventory to load vars for managed-node1 9396 1727204026.78097: Calling groups_inventory to load vars for managed-node1 9396 1727204026.78101: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.78107: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.78115: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.78119: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.78310: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.78577: done with get_vars() 9396 1727204026.78587: done getting variables 9396 1727204026.78663: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 9396 1727204026.78908: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:53:46 -0400 (0:00:00.102) 0:00:02.760 ***** 9396 1727204026.78959: entering _queue_task() for managed-node1/command 9396 1727204026.78962: Creating lock for command 9396 1727204026.79266: worker is 1 (out of 1 available) 9396 1727204026.79279: exiting _queue_task() for managed-node1/command 9396 1727204026.79294: done queuing things up, now waiting for results queue to drain 9396 1727204026.79296: waiting for pending results... 
9396 1727204026.79558: running TaskExecutor() for managed-node1/TASK: Create EPEL 39 9396 1727204026.79702: in run() - task 12b410aa-8751-36c5-1f9e-0000000000fe 9396 1727204026.79723: variable 'ansible_search_path' from source: unknown 9396 1727204026.79731: variable 'ansible_search_path' from source: unknown 9396 1727204026.79772: calling self._execute() 9396 1727204026.79865: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.79879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.79903: variable 'omit' from source: magic vars 9396 1727204026.80381: variable 'ansible_distribution' from source: facts 9396 1727204026.80401: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 9396 1727204026.80410: when evaluation is False, skipping this task 9396 1727204026.80418: _execute() done 9396 1727204026.80426: dumping result to json 9396 1727204026.80437: done dumping result, returning 9396 1727204026.80451: done running TaskExecutor() for managed-node1/TASK: Create EPEL 39 [12b410aa-8751-36c5-1f9e-0000000000fe] 9396 1727204026.80464: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000fe skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 9396 1727204026.80646: no more pending results, returning what we have 9396 1727204026.80651: results queue empty 9396 1727204026.80652: checking for any_errors_fatal 9396 1727204026.80654: done checking for any_errors_fatal 9396 1727204026.80655: checking for max_fail_percentage 9396 1727204026.80657: done checking for max_fail_percentage 9396 1727204026.80658: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.80659: done checking to see if all hosts have failed 9396 1727204026.80660: getting the remaining hosts for this loop 9396 1727204026.80661: done getting the remaining 
hosts for this loop 9396 1727204026.80666: getting the next task for host managed-node1 9396 1727204026.80673: done getting next task for host managed-node1 9396 1727204026.80677: ^ task is: TASK: Install yum-utils package 9396 1727204026.80681: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.80685: getting variables 9396 1727204026.80688: in VariableManager get_vars() 9396 1727204026.80722: Calling all_inventory to load vars for managed-node1 9396 1727204026.80726: Calling groups_inventory to load vars for managed-node1 9396 1727204026.80731: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.80747: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.80751: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.80755: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.81246: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000fe 9396 1727204026.81250: WORKER PROCESS EXITING 9396 1727204026.81278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.81565: done with get_vars() 9396 1727204026.81576: done getting variables 9396 1727204026.81684: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:53:46 -0400 (0:00:00.027) 0:00:02.788 ***** 9396 1727204026.81719: entering _queue_task() for managed-node1/package 9396 1727204026.81721: Creating lock for package 9396 1727204026.81987: worker is 1 (out of 1 available) 9396 1727204026.82001: exiting _queue_task() for managed-node1/package 9396 1727204026.82013: done queuing things up, now waiting for results queue to drain 9396 1727204026.82015: waiting for pending results... 
9396 1727204026.82266: running TaskExecutor() for managed-node1/TASK: Install yum-utils package 9396 1727204026.82411: in run() - task 12b410aa-8751-36c5-1f9e-0000000000ff 9396 1727204026.82433: variable 'ansible_search_path' from source: unknown 9396 1727204026.82443: variable 'ansible_search_path' from source: unknown 9396 1727204026.82483: calling self._execute() 9396 1727204026.82574: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.82591: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.82612: variable 'omit' from source: magic vars 9396 1727204026.83051: variable 'ansible_distribution' from source: facts 9396 1727204026.83071: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 9396 1727204026.83080: when evaluation is False, skipping this task 9396 1727204026.83087: _execute() done 9396 1727204026.83100: dumping result to json 9396 1727204026.83110: done dumping result, returning 9396 1727204026.83122: done running TaskExecutor() for managed-node1/TASK: Install yum-utils package [12b410aa-8751-36c5-1f9e-0000000000ff] 9396 1727204026.83137: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000ff skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 9396 1727204026.83312: no more pending results, returning what we have 9396 1727204026.83316: results queue empty 9396 1727204026.83317: checking for any_errors_fatal 9396 1727204026.83324: done checking for any_errors_fatal 9396 1727204026.83325: checking for max_fail_percentage 9396 1727204026.83327: done checking for max_fail_percentage 9396 1727204026.83328: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.83329: done checking to see if all hosts have failed 9396 1727204026.83330: getting the remaining hosts for this loop 9396 1727204026.83332: done 
getting the remaining hosts for this loop 9396 1727204026.83336: getting the next task for host managed-node1 9396 1727204026.83344: done getting next task for host managed-node1 9396 1727204026.83347: ^ task is: TASK: Enable EPEL 7 9396 1727204026.83351: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.83355: getting variables 9396 1727204026.83357: in VariableManager get_vars() 9396 1727204026.83391: Calling all_inventory to load vars for managed-node1 9396 1727204026.83395: Calling groups_inventory to load vars for managed-node1 9396 1727204026.83399: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.83415: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.83420: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.83424: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.83851: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000ff 9396 1727204026.83855: WORKER PROCESS EXITING 9396 1727204026.83881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.84179: done with get_vars() 9396 1727204026.84191: done getting variables 9396 1727204026.84253: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:53:46 -0400 (0:00:00.025) 0:00:02.813 ***** 9396 1727204026.84285: entering _queue_task() for managed-node1/command 9396 1727204026.84530: worker is 1 (out of 1 available) 9396 1727204026.84543: exiting _queue_task() for managed-node1/command 9396 1727204026.84555: done queuing things up, now waiting for results queue to drain 9396 1727204026.84557: waiting for pending results... 
9396 1727204026.84814: running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 9396 1727204026.84948: in run() - task 12b410aa-8751-36c5-1f9e-000000000100 9396 1727204026.84968: variable 'ansible_search_path' from source: unknown 9396 1727204026.84977: variable 'ansible_search_path' from source: unknown 9396 1727204026.85195: calling self._execute() 9396 1727204026.85198: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.85202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.85204: variable 'omit' from source: magic vars 9396 1727204026.85573: variable 'ansible_distribution' from source: facts 9396 1727204026.85596: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 9396 1727204026.85604: when evaluation is False, skipping this task 9396 1727204026.85612: _execute() done 9396 1727204026.85621: dumping result to json 9396 1727204026.85630: done dumping result, returning 9396 1727204026.85642: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 7 [12b410aa-8751-36c5-1f9e-000000000100] 9396 1727204026.85657: sending task result for task 12b410aa-8751-36c5-1f9e-000000000100 9396 1727204026.85907: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000100 9396 1727204026.85910: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 9396 1727204026.85952: no more pending results, returning what we have 9396 1727204026.85956: results queue empty 9396 1727204026.85957: checking for any_errors_fatal 9396 1727204026.85963: done checking for any_errors_fatal 9396 1727204026.85964: checking for max_fail_percentage 9396 1727204026.85966: done checking for max_fail_percentage 9396 1727204026.85967: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.85968: done checking to see if 
all hosts have failed 9396 1727204026.85969: getting the remaining hosts for this loop 9396 1727204026.85970: done getting the remaining hosts for this loop 9396 1727204026.85974: getting the next task for host managed-node1 9396 1727204026.85980: done getting next task for host managed-node1 9396 1727204026.85983: ^ task is: TASK: Enable EPEL 8 9396 1727204026.85986: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.85991: getting variables 9396 1727204026.85993: in VariableManager get_vars() 9396 1727204026.86019: Calling all_inventory to load vars for managed-node1 9396 1727204026.86022: Calling groups_inventory to load vars for managed-node1 9396 1727204026.86026: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.86037: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.86040: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.86044: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.86318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.86595: done with get_vars() 9396 1727204026.86607: done getting variables 9396 1727204026.86671: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:53:46 -0400 (0:00:00.024) 0:00:02.838 ***** 9396 1727204026.86707: entering _queue_task() for managed-node1/command 9396 1727204026.86948: worker is 1 (out of 1 available) 9396 1727204026.86961: exiting _queue_task() for managed-node1/command 9396 1727204026.86973: done queuing things up, now waiting for results queue to drain 9396 1727204026.86975: waiting for pending results... 
9396 1727204026.87409: running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 9396 1727204026.87414: in run() - task 12b410aa-8751-36c5-1f9e-000000000101 9396 1727204026.87418: variable 'ansible_search_path' from source: unknown 9396 1727204026.87421: variable 'ansible_search_path' from source: unknown 9396 1727204026.87425: calling self._execute() 9396 1727204026.87512: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.87528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.87550: variable 'omit' from source: magic vars 9396 1727204026.87982: variable 'ansible_distribution' from source: facts 9396 1727204026.88005: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 9396 1727204026.88013: when evaluation is False, skipping this task 9396 1727204026.88021: _execute() done 9396 1727204026.88030: dumping result to json 9396 1727204026.88040: done dumping result, returning 9396 1727204026.88052: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 8 [12b410aa-8751-36c5-1f9e-000000000101] 9396 1727204026.88065: sending task result for task 12b410aa-8751-36c5-1f9e-000000000101 skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 9396 1727204026.88239: no more pending results, returning what we have 9396 1727204026.88244: results queue empty 9396 1727204026.88245: checking for any_errors_fatal 9396 1727204026.88249: done checking for any_errors_fatal 9396 1727204026.88250: checking for max_fail_percentage 9396 1727204026.88252: done checking for max_fail_percentage 9396 1727204026.88253: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.88254: done checking to see if all hosts have failed 9396 1727204026.88255: getting the remaining hosts for this loop 9396 1727204026.88256: done getting the remaining hosts 
for this loop 9396 1727204026.88261: getting the next task for host managed-node1 9396 1727204026.88272: done getting next task for host managed-node1 9396 1727204026.88275: ^ task is: TASK: Enable EPEL 6 9396 1727204026.88279: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.88284: getting variables 9396 1727204026.88287: in VariableManager get_vars() 9396 1727204026.88321: Calling all_inventory to load vars for managed-node1 9396 1727204026.88325: Calling groups_inventory to load vars for managed-node1 9396 1727204026.88330: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.88345: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.88350: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.88354: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.88769: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000101 9396 1727204026.88772: WORKER PROCESS EXITING 9396 1727204026.88801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.89075: done with get_vars() 9396 1727204026.89086: done getting variables 9396 1727204026.89153: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:53:46 -0400 (0:00:00.024) 0:00:02.862 ***** 9396 1727204026.89186: entering _queue_task() for managed-node1/copy 9396 1727204026.89413: worker is 1 (out of 1 available) 9396 1727204026.89427: exiting _queue_task() for managed-node1/copy 9396 1727204026.89439: done queuing things up, now waiting for results queue to drain 9396 1727204026.89441: waiting for pending results... 
9396 1727204026.89681: running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 9396 1727204026.89815: in run() - task 12b410aa-8751-36c5-1f9e-000000000103 9396 1727204026.89838: variable 'ansible_search_path' from source: unknown 9396 1727204026.89848: variable 'ansible_search_path' from source: unknown 9396 1727204026.89888: calling self._execute() 9396 1727204026.89975: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.89988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.90008: variable 'omit' from source: magic vars 9396 1727204026.90434: variable 'ansible_distribution' from source: facts 9396 1727204026.90460: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 9396 1727204026.90469: when evaluation is False, skipping this task 9396 1727204026.90477: _execute() done 9396 1727204026.90486: dumping result to json 9396 1727204026.90497: done dumping result, returning 9396 1727204026.90695: done running TaskExecutor() for managed-node1/TASK: Enable EPEL 6 [12b410aa-8751-36c5-1f9e-000000000103] 9396 1727204026.90699: sending task result for task 12b410aa-8751-36c5-1f9e-000000000103 9396 1727204026.90766: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000103 9396 1727204026.90770: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 9396 1727204026.90811: no more pending results, returning what we have 9396 1727204026.90814: results queue empty 9396 1727204026.90815: checking for any_errors_fatal 9396 1727204026.90820: done checking for any_errors_fatal 9396 1727204026.90821: checking for max_fail_percentage 9396 1727204026.90822: done checking for max_fail_percentage 9396 1727204026.90823: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.90824: done checking to see if 
all hosts have failed 9396 1727204026.90825: getting the remaining hosts for this loop 9396 1727204026.90827: done getting the remaining hosts for this loop 9396 1727204026.90830: getting the next task for host managed-node1 9396 1727204026.90839: done getting next task for host managed-node1 9396 1727204026.90842: ^ task is: TASK: Set network provider to 'nm' 9396 1727204026.90844: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204026.90848: getting variables 9396 1727204026.90849: in VariableManager get_vars() 9396 1727204026.90876: Calling all_inventory to load vars for managed-node1 9396 1727204026.90879: Calling groups_inventory to load vars for managed-node1 9396 1727204026.90883: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.90895: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.90898: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.90902: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.91144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.91414: done with get_vars() 9396 1727204026.91425: done getting variables 9396 1727204026.91491: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:13 Tuesday 24 September 2024 14:53:46 -0400 (0:00:00.023) 0:00:02.886 ***** 9396 1727204026.91521: entering _queue_task() for managed-node1/set_fact 9396 1727204026.91744: worker is 1 (out of 1 available) 9396 1727204026.91757: exiting _queue_task() for managed-node1/set_fact 9396 1727204026.91770: done queuing things up, now waiting for results queue to drain 9396 1727204026.91771: waiting for pending results... 9396 1727204026.92015: running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' 9396 1727204026.92197: in run() - task 12b410aa-8751-36c5-1f9e-000000000007 9396 1727204026.92201: variable 'ansible_search_path' from source: unknown 9396 1727204026.92204: calling self._execute() 9396 1727204026.92264: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.92277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.92297: variable 'omit' from source: magic vars 9396 1727204026.92422: variable 'omit' from source: magic vars 9396 1727204026.92467: variable 'omit' from source: magic vars 9396 1727204026.92517: variable 'omit' from source: magic vars 9396 1727204026.92567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204026.92614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204026.92644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204026.92671: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204026.92690: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204026.92794: variable 'inventory_hostname' 
from source: host vars for 'managed-node1' 9396 1727204026.92798: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.92800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.92868: Set connection var ansible_timeout to 10 9396 1727204026.92880: Set connection var ansible_shell_executable to /bin/sh 9396 1727204026.92897: Set connection var ansible_pipelining to False 9396 1727204026.92911: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204026.92925: Set connection var ansible_connection to ssh 9396 1727204026.92932: Set connection var ansible_shell_type to sh 9396 1727204026.92964: variable 'ansible_shell_executable' from source: unknown 9396 1727204026.92993: variable 'ansible_connection' from source: unknown 9396 1727204026.92997: variable 'ansible_module_compression' from source: unknown 9396 1727204026.92999: variable 'ansible_shell_type' from source: unknown 9396 1727204026.93001: variable 'ansible_shell_executable' from source: unknown 9396 1727204026.93004: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.93006: variable 'ansible_pipelining' from source: unknown 9396 1727204026.93011: variable 'ansible_timeout' from source: unknown 9396 1727204026.93194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.93198: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204026.93208: variable 'omit' from source: magic vars 9396 1727204026.93219: starting attempt loop 9396 1727204026.93226: running the handler 9396 1727204026.93246: handler run complete 9396 1727204026.93261: attempt loop complete, returning result 9396 
1727204026.93269: _execute() done 9396 1727204026.93276: dumping result to json 9396 1727204026.93285: done dumping result, returning 9396 1727204026.93299: done running TaskExecutor() for managed-node1/TASK: Set network provider to 'nm' [12b410aa-8751-36c5-1f9e-000000000007] 9396 1727204026.93314: sending task result for task 12b410aa-8751-36c5-1f9e-000000000007 9396 1727204026.93419: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000007 ok: [managed-node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 9396 1727204026.93554: no more pending results, returning what we have 9396 1727204026.93558: results queue empty 9396 1727204026.93559: checking for any_errors_fatal 9396 1727204026.93567: done checking for any_errors_fatal 9396 1727204026.93568: checking for max_fail_percentage 9396 1727204026.93570: done checking for max_fail_percentage 9396 1727204026.93571: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.93572: done checking to see if all hosts have failed 9396 1727204026.93573: getting the remaining hosts for this loop 9396 1727204026.93575: done getting the remaining hosts for this loop 9396 1727204026.93581: getting the next task for host managed-node1 9396 1727204026.93591: done getting next task for host managed-node1 9396 1727204026.93594: ^ task is: TASK: meta (flush_handlers) 9396 1727204026.93596: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.93600: getting variables 9396 1727204026.93603: in VariableManager get_vars() 9396 1727204026.93635: Calling all_inventory to load vars for managed-node1 9396 1727204026.93639: Calling groups_inventory to load vars for managed-node1 9396 1727204026.93643: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.93656: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.93660: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.93665: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.94022: WORKER PROCESS EXITING 9396 1727204026.94047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.94305: done with get_vars() 9396 1727204026.94315: done getting variables 9396 1727204026.94386: in VariableManager get_vars() 9396 1727204026.94399: Calling all_inventory to load vars for managed-node1 9396 1727204026.94401: Calling groups_inventory to load vars for managed-node1 9396 1727204026.94405: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.94410: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.94413: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.94417: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.94599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.94854: done with get_vars() 9396 1727204026.94869: done queuing things up, now waiting for results queue to drain 9396 1727204026.94871: results queue empty 9396 1727204026.94872: checking for any_errors_fatal 9396 1727204026.94874: done checking for any_errors_fatal 9396 1727204026.94875: checking for max_fail_percentage 9396 1727204026.94877: done checking for max_fail_percentage 9396 1727204026.94878: checking to see if all hosts have failed and the 
running result is not ok 9396 1727204026.94879: done checking to see if all hosts have failed 9396 1727204026.94880: getting the remaining hosts for this loop 9396 1727204026.94881: done getting the remaining hosts for this loop 9396 1727204026.94883: getting the next task for host managed-node1 9396 1727204026.94888: done getting next task for host managed-node1 9396 1727204026.94891: ^ task is: TASK: meta (flush_handlers) 9396 1727204026.94893: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204026.94901: getting variables 9396 1727204026.94902: in VariableManager get_vars() 9396 1727204026.94911: Calling all_inventory to load vars for managed-node1 9396 1727204026.94914: Calling groups_inventory to load vars for managed-node1 9396 1727204026.94917: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.94922: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.94925: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.94929: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.95107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.95560: done with get_vars() 9396 1727204026.95569: done getting variables 9396 1727204026.95623: in VariableManager get_vars() 9396 1727204026.95632: Calling all_inventory to load vars for managed-node1 9396 1727204026.95635: Calling groups_inventory to load vars for managed-node1 9396 1727204026.95638: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.95643: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.95646: Calling groups_plugins_inventory to load vars for 
managed-node1 9396 1727204026.95649: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.95850: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.96108: done with get_vars() 9396 1727204026.96121: done queuing things up, now waiting for results queue to drain 9396 1727204026.96123: results queue empty 9396 1727204026.96124: checking for any_errors_fatal 9396 1727204026.96125: done checking for any_errors_fatal 9396 1727204026.96126: checking for max_fail_percentage 9396 1727204026.96127: done checking for max_fail_percentage 9396 1727204026.96128: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.96129: done checking to see if all hosts have failed 9396 1727204026.96130: getting the remaining hosts for this loop 9396 1727204026.96131: done getting the remaining hosts for this loop 9396 1727204026.96134: getting the next task for host managed-node1 9396 1727204026.96137: done getting next task for host managed-node1 9396 1727204026.96138: ^ task is: None 9396 1727204026.96140: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.96141: done queuing things up, now waiting for results queue to drain 9396 1727204026.96142: results queue empty 9396 1727204026.96143: checking for any_errors_fatal 9396 1727204026.96144: done checking for any_errors_fatal 9396 1727204026.96145: checking for max_fail_percentage 9396 1727204026.96146: done checking for max_fail_percentage 9396 1727204026.96147: checking to see if all hosts have failed and the running result is not ok 9396 1727204026.96148: done checking to see if all hosts have failed 9396 1727204026.96150: getting the next task for host managed-node1 9396 1727204026.96152: done getting next task for host managed-node1 9396 1727204026.96153: ^ task is: None 9396 1727204026.96155: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.96204: in VariableManager get_vars() 9396 1727204026.96230: done with get_vars() 9396 1727204026.96237: in VariableManager get_vars() 9396 1727204026.96254: done with get_vars() 9396 1727204026.96259: variable 'omit' from source: magic vars 9396 1727204026.96297: in VariableManager get_vars() 9396 1727204026.96317: done with get_vars() 9396 1727204026.96344: variable 'omit' from source: magic vars PLAY [Play for testing bond device using deprecated 'master' argument] ********* 9396 1727204026.97239: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 9396 1727204026.97268: getting the remaining hosts for this loop 9396 1727204026.97270: done getting the remaining hosts for this loop 9396 1727204026.97273: getting the next task for host managed-node1 9396 1727204026.97276: done getting next task for host managed-node1 9396 1727204026.97278: ^ task is: TASK: Gathering Facts 9396 1727204026.97280: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204026.97282: getting variables 9396 1727204026.97284: in VariableManager get_vars() 9396 1727204026.97302: Calling all_inventory to load vars for managed-node1 9396 1727204026.97305: Calling groups_inventory to load vars for managed-node1 9396 1727204026.97308: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204026.97313: Calling all_plugins_play to load vars for managed-node1 9396 1727204026.97330: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204026.97334: Calling groups_plugins_play to load vars for managed-node1 9396 1727204026.97882: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204026.98303: done with get_vars() 9396 1727204026.98313: done getting variables 9396 1727204026.98420: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3 Tuesday 24 September 2024 14:53:46 -0400 (0:00:00.069) 0:00:02.955 ***** 9396 1727204026.98448: entering _queue_task() for managed-node1/gather_facts 9396 1727204026.98798: worker is 1 (out of 1 available) 9396 1727204026.98811: exiting _queue_task() for managed-node1/gather_facts 9396 1727204026.98825: done queuing things up, now waiting for results queue to drain 9396 1727204026.98827: waiting for pending results... 
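Tasks throughout this log are gated by `when:` conditionals — "Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False ... when evaluation is False, skipping this task", followed by a `skipping:` result. A minimal sketch of that gate, evaluating the expression directly against a facts dict (illustrative only; Ansible actually templates the expression through Jinja2, and the fact values below are assumed):

```python
# Assumed fact values for illustration; the log does not show them directly.
facts = {
    "ansible_distribution": "Fedora",
    "ansible_distribution_major_version": "40",
}

def evaluate_when(expression, facts):
    # eval() over a restricted namespace stands in for Jinja2 templating.
    return bool(eval(expression, {"__builtins__": {}}, dict(facts)))

def maybe_run(task_name, when, facts):
    if not evaluate_when(when, facts):
        # Matches the skipped-task result shape emitted in the log.
        return {"changed": False, "false_condition": when,
                "skip_reason": "Conditional result was False"}
    return {"changed": False}

res = maybe_run("Enable EPEL 6",
                "ansible_distribution in ['RedHat', 'CentOS']", facts)
print(res)
```

With these assumed facts the first conditional evaluates False (so the task is skipped, as for "Enable EPEL 6" above), while `ansible_distribution_major_version != '6'` evaluates True and lets the task proceed.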
9396 1727204026.99067: running TaskExecutor() for managed-node1/TASK: Gathering Facts 9396 1727204026.99140: in run() - task 12b410aa-8751-36c5-1f9e-000000000129 9396 1727204026.99153: variable 'ansible_search_path' from source: unknown 9396 1727204026.99194: calling self._execute() 9396 1727204026.99270: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.99277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.99289: variable 'omit' from source: magic vars 9396 1727204026.99665: variable 'ansible_distribution_major_version' from source: facts 9396 1727204026.99679: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204026.99685: variable 'omit' from source: magic vars 9396 1727204026.99713: variable 'omit' from source: magic vars 9396 1727204026.99743: variable 'omit' from source: magic vars 9396 1727204026.99779: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204026.99815: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204026.99834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204026.99853: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204026.99864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204026.99899: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204026.99903: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204026.99905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204026.99992: Set connection var ansible_timeout to 10 9396 1727204026.99999: Set connection var 
ansible_shell_executable to /bin/sh 9396 1727204027.00010: Set connection var ansible_pipelining to False 9396 1727204027.00021: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204027.00027: Set connection var ansible_connection to ssh 9396 1727204027.00030: Set connection var ansible_shell_type to sh 9396 1727204027.00054: variable 'ansible_shell_executable' from source: unknown 9396 1727204027.00057: variable 'ansible_connection' from source: unknown 9396 1727204027.00060: variable 'ansible_module_compression' from source: unknown 9396 1727204027.00062: variable 'ansible_shell_type' from source: unknown 9396 1727204027.00065: variable 'ansible_shell_executable' from source: unknown 9396 1727204027.00071: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204027.00073: variable 'ansible_pipelining' from source: unknown 9396 1727204027.00078: variable 'ansible_timeout' from source: unknown 9396 1727204027.00083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204027.00314: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204027.00318: variable 'omit' from source: magic vars 9396 1727204027.00321: starting attempt loop 9396 1727204027.00324: running the handler 9396 1727204027.00337: variable 'ansible_facts' from source: unknown 9396 1727204027.00356: _low_level_execute_command(): starting 9396 1727204027.00364: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204027.01339: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204027.01344: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204027.01347: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204027.01349: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204027.01508: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204027.01551: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204027.03237: stdout chunk (state=3): >>>/root <<< 9396 1727204027.03338: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204027.03410: stderr chunk (state=3): >>><<< 9396 1727204027.03414: stdout chunk (state=3): >>><<< 9396 1727204027.03432: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204027.03445: _low_level_execute_command(): starting 9396 1727204027.03457: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908 `" && echo ansible-tmp-1727204027.0343304-10037-194987115302908="` echo /root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908 `" ) && sleep 0' 9396 1727204027.03943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204027.03947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204027.03949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204027.03958: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204027.04030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204027.04034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204027.04037: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204027.04102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204027.06126: stdout chunk (state=3): >>>ansible-tmp-1727204027.0343304-10037-194987115302908=/root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908 <<< 9396 1727204027.06362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204027.06365: stdout chunk (state=3): >>><<< 9396 1727204027.06368: stderr chunk (state=3): >>><<< 9396 1727204027.06388: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204027.0343304-10037-194987115302908=/root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204027.06432: variable 'ansible_module_compression' from source: unknown 9396 1727204027.06508: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 9396 1727204027.06794: variable 'ansible_facts' from source: unknown 9396 1727204027.06798: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/AnsiballZ_setup.py 9396 1727204027.07073: Sending initial data 9396 1727204027.07076: Sent initial data (153 bytes) 9396 1727204027.08007: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204027.08175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 9396 1727204027.08200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204027.08279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204027.10005: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 9396 1727204027.10033: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204027.10092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204027.10157: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpgwi2r4x6 /root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/AnsiballZ_setup.py <<< 9396 1727204027.10180: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/AnsiballZ_setup.py" <<< 9396 1727204027.10225: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpgwi2r4x6" to remote "/root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/AnsiballZ_setup.py" <<< 9396 1727204027.14757: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204027.14762: stdout chunk (state=3): >>><<< 9396 1727204027.14765: stderr chunk (state=3): >>><<< 9396 1727204027.14768: done transferring module to remote 9396 1727204027.14781: _low_level_execute_command(): starting 9396 1727204027.14798: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/ /root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/AnsiballZ_setup.py && sleep 0' 9396 1727204027.15864: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204027.15914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204027.15943: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204027.16030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204027.16072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204027.16113: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204027.16177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204027.18277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204027.18352: stderr chunk (state=3): >>><<< 9396 1727204027.18371: stdout chunk (state=3): >>><<< 9396 1727204027.18718: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204027.18723: _low_level_execute_command(): starting 9396 1727204027.18726: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/AnsiballZ_setup.py && sleep 0' 9396 1727204027.19727: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204027.19907: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204027.20104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204027.20307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204027.20426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204027.90826: stdout chunk (state=3): >>> {"ansible_facts": 
{"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2862, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, 
"used": 855, "free": 2862}, "nocache": {"free": 3483, "used": 234}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_uuid": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, 
"ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"<<< 9396 1727204027.90868: stdout chunk (state=3): >>>]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 517, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251157766144, "block_size": 4096, "block_total": 64479564, "block_available": 61317814, "block_used": 3161750, "inode_total": 16384000, "inode_available": 16302271, "inode_used": 81729, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "ne<<< 9396 1727204027.90882: stdout chunk (state=3): >>>twork": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d080:f60d:659:9515", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.210"], "ansible_all_ipv6_addresses": ["fe80::d080:f60d:659:9515"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.210", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d080:f60d:659:9515"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.68310546875, "5m": 0.3642578125, 
"15m": 0.1630859375}, "ansible_fibre_channel_wwn": [], "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "47", "epoch": "1727204027", "epoch_int": "1727204027", "date": "2024-09-24", "time": "14:53:47", "iso8601_micro": "2024-09-24T18:53:47.905223Z", "iso8601": "2024-09-24T18:53:47Z", "iso8601_basic": "20240924T145347905223", "iso8601_basic_short": "20240924T145347", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 9396 1727204027.92941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204027.93004: stderr chunk (state=3): >>><<< 9396 1727204027.93011: stdout chunk (state=3): >>><<< 9396 1727204027.93044: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node1", "ansible_hostname": "managed-node1", "ansible_nodename": "managed-node1", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec295f006a273ed037dc1f179c04a840", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, 
"ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2862, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 855, "free": 2862}, "nocache": {"free": 3483, "used": 234}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_uuid": "ec295f00-6a27-3ed0-37dc-1f179c04a840", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, 
"sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 517, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251157766144, "block_size": 4096, "block_total": 64479564, "block_available": 61317814, "block_used": 3161750, "inode_total": 16384000, "inode_available": 16302271, "inode_used": 81729, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_iscsi_iqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_is_chroot": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBAPMunxA14oEo4zS2fFpjhbXGrPN+Pc6yYvB7bHw1sbQmdYiLO2rxhswFBK4xGCi3vLH7aJpii+0+X9RSsTnEBO0RK8RoPR2xkDNYXW0a1/AwnAaak2bYj0nyDuqJg37AS/oUOb1vdeV3ZYBHlx2BeYopb3qrr4hWKyEoB0Cj4GUfAAAAFQDdJ5ecc5ETygm9XhUUj7x91BVbMwAAAIEA3LP6y0wGAOTGTdHT9uHV9ZnnWPry3FD498XUhfd/8katSmv9dBqFZ5BSlmylNhNOGN/dgGvIysah3TjyiVgAhMIDSxyWXeNKylfCPrSiGgzM8sPtvUHAKjCr4YeqDBRpE2nuYpznop73ZNQ+pIgZqdMFXs4mhUw8Ai2Xc/5SU50AAACAdpxXHRObj5kiSZgGAlvwkslKIUteCSyoGibmNskfiBNzJY3St95HEK+e2xMQboTkyY3hrBqloLoBzSVdWeHcA4Dy35X8VTnKqPND6sF4ZlGbbCJ4i7j+NZNvY9YE7WlAvhx04nuXVm8WrfYDcMosawu6xUqwt2jyqxyZoW4D7vM=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQDPMZg3FopBCtnsfqhDL1jxsml2miVdyRUl9+VLqXB9qIsKgdugZ5C55RjTiQ22wsa2Eh+b3yyIsvxFblaHxpQzIRnwsGaUu0F4hFCZXlaF5fc3O+m2QULrdxgV3MgQyWL48mVBiOB+GPPbs7QmzI86NB7uKRLNDd/a1TptTIakCXZG8IzEbICTS8L5pdUZ9xNLEft03pnuhGY/GBZ92mu+wYkGzptfYkjUD3tquOMvoARgvTTX7p7aXxFfSocK0lHZFLYlKMJRVt54wVcnxHlL5CKemFAnA9S+D8LZ5wZeoUJSE/kn/yvpO8XUisytKZyOmRFK/G3+cUWh6Pd7qYdOok4cofVXlyuTOw2mI5oI0U9r4iP+OK/lmhxRulzsX5W/l0DkEwkU1RuSRlvwu5f9EnfGb0i2T/oUti+RAH1DryJz8HsYqwWh73E/eQA3Syq8QjnIsYEPvoPNycnSAARw/gUeutemNV7jA6AoH96WGXVckMWfsvjlKleAA9RzgPc=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBDEVvjAxgU/tcZsEFgZpofCf1HGzA0uJ8gV8vzQgeRwbkH+VzA0Knn/ZxPSYccws9QCPb2xcCpDQhETAIG5RKHc=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIAbPQMlsrcdV/DtPY+pi9Fcm1KJrxSB0LaYtXxUu2kxn", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_lsb": {}, "ansible_hostnqn": "", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.9.8 50712 10.31.11.210 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 50712 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_eth0": {"device": "eth0", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::d080:f60d:659:9515", "prefix": "64", "scope": "link"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.11.210", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:d4:45:6e:f8:dd", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.11.210"], "ansible_all_ipv6_addresses": ["fe80::d080:f60d:659:9515"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.11.210", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::d080:f60d:659:9515"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_loadavg": {"1m": 0.68310546875, "5m": 0.3642578125, "15m": 0.1630859375}, "ansible_fibre_channel_wwn": [], 
"ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "53", "second": "47", "epoch": "1727204027", "epoch_int": "1727204027", "date": "2024-09-24", "time": "14:53:47", "iso8601_micro": "2024-09-24T18:53:47.905223Z", "iso8601": "2024-09-24T18:53:47Z", "iso8601_basic": "20240924T145347905223", "iso8601_basic_short": "20240924T145347", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fips": false, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: 
master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204027.93282: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204027.93303: _low_level_execute_command(): starting 9396 1727204027.93311: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204027.0343304-10037-194987115302908/ > /dev/null 2>&1 && sleep 0' 9396 1727204027.93801: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204027.93805: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204027.93807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204027.93810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204027.93865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204027.93869: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204027.93919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204027.95873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204027.95922: stderr chunk (state=3): >>><<< 9396 1727204027.95926: stdout chunk (state=3): >>><<< 9396 1727204027.95941: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 9396 1727204027.95951: handler run complete 9396 1727204027.96058: variable 'ansible_facts' from source: unknown 9396 1727204027.96141: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204027.96392: variable 'ansible_facts' from source: unknown 9396 1727204027.96464: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204027.96571: attempt loop complete, returning result 9396 1727204027.96575: _execute() done 9396 1727204027.96579: dumping result to json 9396 1727204027.96600: done dumping result, returning 9396 1727204027.96608: done running TaskExecutor() for managed-node1/TASK: Gathering Facts [12b410aa-8751-36c5-1f9e-000000000129] 9396 1727204027.96617: sending task result for task 12b410aa-8751-36c5-1f9e-000000000129 ok: [managed-node1] 9396 1727204027.97158: no more pending results, returning what we have 9396 1727204027.97161: results queue empty 9396 1727204027.97162: checking for any_errors_fatal 9396 1727204027.97163: done checking for any_errors_fatal 9396 1727204027.97163: checking for max_fail_percentage 9396 1727204027.97164: done checking for max_fail_percentage 9396 1727204027.97165: checking to see if all hosts have failed and the running result is not ok 9396 1727204027.97165: done checking to see if all hosts have failed 9396 1727204027.97166: getting the remaining hosts for this loop 9396 1727204027.97167: done getting the remaining hosts for this loop 9396 1727204027.97169: getting the next task for host managed-node1 9396 1727204027.97174: done getting next task for host managed-node1 9396 1727204027.97175: ^ task is: TASK: meta (flush_handlers) 9396 1727204027.97176: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 9396 1727204027.97179: getting variables 9396 1727204027.97180: in VariableManager get_vars() 9396 1727204027.97207: Calling all_inventory to load vars for managed-node1 9396 1727204027.97209: Calling groups_inventory to load vars for managed-node1 9396 1727204027.97211: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204027.97226: Calling all_plugins_play to load vars for managed-node1 9396 1727204027.97228: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204027.97232: Calling groups_plugins_play to load vars for managed-node1 9396 1727204027.97364: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000129 9396 1727204027.97368: WORKER PROCESS EXITING 9396 1727204027.97380: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204027.97540: done with get_vars() 9396 1727204027.97551: done getting variables 9396 1727204027.97626: in VariableManager get_vars() 9396 1727204027.97643: Calling all_inventory to load vars for managed-node1 9396 1727204027.97646: Calling groups_inventory to load vars for managed-node1 9396 1727204027.97649: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204027.97654: Calling all_plugins_play to load vars for managed-node1 9396 1727204027.97657: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204027.97661: Calling groups_plugins_play to load vars for managed-node1 9396 1727204027.97787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204027.97938: done with get_vars() 9396 1727204027.97949: done queuing things up, now waiting for results queue to drain 9396 1727204027.97950: results queue empty 9396 1727204027.97951: checking for any_errors_fatal 9396 1727204027.97953: done checking for any_errors_fatal 9396 1727204027.97954: 
checking for max_fail_percentage 9396 1727204027.97955: done checking for max_fail_percentage 9396 1727204027.97955: checking to see if all hosts have failed and the running result is not ok 9396 1727204027.97959: done checking to see if all hosts have failed 9396 1727204027.97959: getting the remaining hosts for this loop 9396 1727204027.97960: done getting the remaining hosts for this loop 9396 1727204027.97962: getting the next task for host managed-node1 9396 1727204027.97965: done getting next task for host managed-node1 9396 1727204027.97966: ^ task is: TASK: INIT Prepare setup 9396 1727204027.97967: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204027.97969: getting variables 9396 1727204027.97969: in VariableManager get_vars() 9396 1727204027.97978: Calling all_inventory to load vars for managed-node1 9396 1727204027.97980: Calling groups_inventory to load vars for managed-node1 9396 1727204027.97981: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204027.97985: Calling all_plugins_play to load vars for managed-node1 9396 1727204027.97987: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204027.97990: Calling groups_plugins_play to load vars for managed-node1 9396 1727204027.98098: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204027.98268: done with get_vars() 9396 1727204027.98279: done getting variables 9396 1727204027.98382: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:15 Tuesday 24 September 2024 14:53:47 -0400 (0:00:00.999) 0:00:03.955 ***** 9396 1727204027.98419: entering _queue_task() for managed-node1/debug 9396 1727204027.98422: Creating lock for debug 9396 1727204027.98742: worker is 1 (out of 1 available) 9396 1727204027.98754: exiting _queue_task() for managed-node1/debug 9396 1727204027.98766: done queuing things up, now waiting for results queue to drain 9396 1727204027.98768: waiting for pending results... 9396 1727204027.98935: running TaskExecutor() for managed-node1/TASK: INIT Prepare setup 9396 1727204027.99004: in run() - task 12b410aa-8751-36c5-1f9e-00000000000b 9396 1727204027.99120: variable 'ansible_search_path' from source: unknown 9396 1727204027.99130: calling self._execute() 9396 1727204027.99455: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204027.99486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204027.99521: variable 'omit' from source: magic vars 9396 1727204028.00108: variable 'ansible_distribution_major_version' from source: facts 9396 1727204028.00122: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204028.00132: variable 'omit' from source: magic vars 9396 1727204028.00166: variable 'omit' from source: magic vars 9396 1727204028.00294: variable 'omit' from source: magic vars 9396 1727204028.00330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204028.00383: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204028.00402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204028.00422: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204028.00452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204028.00513: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204028.00559: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204028.00562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204028.00661: Set connection var ansible_timeout to 10 9396 1727204028.00665: Set connection var ansible_shell_executable to /bin/sh 9396 1727204028.00676: Set connection var ansible_pipelining to False 9396 1727204028.00682: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204028.00690: Set connection var ansible_connection to ssh 9396 1727204028.00693: Set connection var ansible_shell_type to sh 9396 1727204028.00719: variable 'ansible_shell_executable' from source: unknown 9396 1727204028.00723: variable 'ansible_connection' from source: unknown 9396 1727204028.00726: variable 'ansible_module_compression' from source: unknown 9396 1727204028.00729: variable 'ansible_shell_type' from source: unknown 9396 1727204028.00731: variable 'ansible_shell_executable' from source: unknown 9396 1727204028.00735: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204028.00741: variable 'ansible_pipelining' from source: unknown 9396 1727204028.00744: variable 'ansible_timeout' from source: unknown 9396 1727204028.00750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204028.00867: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204028.00882: variable 'omit' from source: magic vars 9396 1727204028.00891: starting attempt loop 9396 1727204028.00894: running the handler 9396 1727204028.00935: handler run complete 9396 1727204028.00956: attempt loop complete, returning result 9396 1727204028.00959: _execute() done 9396 1727204028.00962: dumping result to json 9396 1727204028.00966: done dumping result, returning 9396 1727204028.00978: done running TaskExecutor() for managed-node1/TASK: INIT Prepare setup [12b410aa-8751-36c5-1f9e-00000000000b] 9396 1727204028.00985: sending task result for task 12b410aa-8751-36c5-1f9e-00000000000b 9396 1727204028.01091: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000000b 9396 1727204028.01094: WORKER PROCESS EXITING ok: [managed-node1] => {} MSG: ################################################## 9396 1727204028.01332: no more pending results, returning what we have 9396 1727204028.01336: results queue empty 9396 1727204028.01337: checking for any_errors_fatal 9396 1727204028.01339: done checking for any_errors_fatal 9396 1727204028.01340: checking for max_fail_percentage 9396 1727204028.01342: done checking for max_fail_percentage 9396 1727204028.01343: checking to see if all hosts have failed and the running result is not ok 9396 1727204028.01344: done checking to see if all hosts have failed 9396 1727204028.01345: getting the remaining hosts for this loop 9396 1727204028.01346: done getting the remaining hosts for this loop 9396 1727204028.01351: getting the next task for host managed-node1 9396 1727204028.01358: done getting next task for host managed-node1 9396 1727204028.01362: ^ task is: TASK: Install dnsmasq 9396 1727204028.01365: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204028.01368: getting variables 9396 1727204028.01370: in VariableManager get_vars() 9396 1727204028.01532: Calling all_inventory to load vars for managed-node1 9396 1727204028.01535: Calling groups_inventory to load vars for managed-node1 9396 1727204028.01538: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204028.01549: Calling all_plugins_play to load vars for managed-node1 9396 1727204028.01553: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204028.01557: Calling groups_plugins_play to load vars for managed-node1 9396 1727204028.01781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204028.02093: done with get_vars() 9396 1727204028.02104: done getting variables 9396 1727204028.02172: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:53:48 -0400 (0:00:00.037) 0:00:03.993 ***** 9396 1727204028.02207: entering _queue_task() for managed-node1/package 9396 
1727204028.02700: worker is 1 (out of 1 available) 9396 1727204028.02713: exiting _queue_task() for managed-node1/package 9396 1727204028.02722: done queuing things up, now waiting for results queue to drain 9396 1727204028.02724: waiting for pending results... 9396 1727204028.02855: running TaskExecutor() for managed-node1/TASK: Install dnsmasq 9396 1727204028.02953: in run() - task 12b410aa-8751-36c5-1f9e-00000000000f 9396 1727204028.02975: variable 'ansible_search_path' from source: unknown 9396 1727204028.02983: variable 'ansible_search_path' from source: unknown 9396 1727204028.03032: calling self._execute() 9396 1727204028.03133: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204028.03148: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204028.03280: variable 'omit' from source: magic vars 9396 1727204028.03633: variable 'ansible_distribution_major_version' from source: facts 9396 1727204028.03654: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204028.03666: variable 'omit' from source: magic vars 9396 1727204028.03738: variable 'omit' from source: magic vars 9396 1727204028.04001: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204028.06053: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204028.06108: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204028.06157: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204028.06184: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204028.06210: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204028.06298: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204028.06327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204028.06350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204028.06385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204028.06399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204028.06492: variable '__network_is_ostree' from source: set_fact 9396 1727204028.06495: variable 'omit' from source: magic vars 9396 1727204028.06526: variable 'omit' from source: magic vars 9396 1727204028.06552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204028.06576: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204028.06592: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204028.06610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204028.06623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204028.06651: 
variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204028.06654: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204028.06657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204028.06743: Set connection var ansible_timeout to 10 9396 1727204028.06749: Set connection var ansible_shell_executable to /bin/sh 9396 1727204028.06760: Set connection var ansible_pipelining to False 9396 1727204028.06767: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204028.06775: Set connection var ansible_connection to ssh 9396 1727204028.06778: Set connection var ansible_shell_type to sh 9396 1727204028.06806: variable 'ansible_shell_executable' from source: unknown 9396 1727204028.06830: variable 'ansible_connection' from source: unknown 9396 1727204028.06834: variable 'ansible_module_compression' from source: unknown 9396 1727204028.06836: variable 'ansible_shell_type' from source: unknown 9396 1727204028.06839: variable 'ansible_shell_executable' from source: unknown 9396 1727204028.06841: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204028.06843: variable 'ansible_pipelining' from source: unknown 9396 1727204028.06845: variable 'ansible_timeout' from source: unknown 9396 1727204028.06847: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204028.07005: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204028.07008: variable 'omit' from source: magic vars 9396 1727204028.07011: starting attempt loop 9396 1727204028.07013: running the handler 9396 1727204028.07015: variable 'ansible_facts' from source: unknown 9396 1727204028.07018: 
variable 'ansible_facts' from source: unknown 9396 1727204028.07205: _low_level_execute_command(): starting 9396 1727204028.07208: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204028.07762: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204028.07767: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204028.07795: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204028.07798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204028.07807: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204028.07872: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204028.07878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204028.07884: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204028.07887: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204028.07893: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204028.07896: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204028.07898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204028.07900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204028.07903: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204028.07905: stderr chunk (state=3): >>>debug2: match found <<< 9396 1727204028.07917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 
1727204028.07994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204028.08007: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204028.08049: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204028.08101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204028.09879: stdout chunk (state=3): >>>/root <<< 9396 1727204028.10082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204028.10085: stdout chunk (state=3): >>><<< 9396 1727204028.10088: stderr chunk (state=3): >>><<< 9396 1727204028.10111: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204028.10130: _low_level_execute_command(): starting 9396 1727204028.10221: _low_level_execute_command(): 
executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669 `" && echo ansible-tmp-1727204028.1011796-10071-37845794627669="` echo /root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669 `" ) && sleep 0' 9396 1727204028.10769: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204028.10848: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204028.10871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204028.10924: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204028.10944: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204028.10988: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204028.11092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204028.13151: stdout chunk (state=3): >>>ansible-tmp-1727204028.1011796-10071-37845794627669=/root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669 <<< 9396 1727204028.13399: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204028.13402: stdout chunk (state=3): >>><<< 9396 1727204028.13405: stderr chunk (state=3): >>><<< 9396 1727204028.13411: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204028.1011796-10071-37845794627669=/root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204028.13525: variable 'ansible_module_compression' from source: unknown 9396 1727204028.13546: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 9396 1727204028.13555: ANSIBALLZ: Acquiring lock 9396 1727204028.13564: ANSIBALLZ: Lock acquired: 139797141880704 9396 1727204028.13573: ANSIBALLZ: Creating module 9396 1727204028.35336: ANSIBALLZ: Writing module into payload 9396 1727204028.35613: ANSIBALLZ: Writing module 9396 1727204028.35639: ANSIBALLZ: Renaming 
module 9396 1727204028.35644: ANSIBALLZ: Done creating module 9396 1727204028.35661: variable 'ansible_facts' from source: unknown 9396 1727204028.35726: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/AnsiballZ_dnf.py 9396 1727204028.35848: Sending initial data 9396 1727204028.35852: Sent initial data (150 bytes) 9396 1727204028.36309: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204028.36313: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204028.36316: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204028.36318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204028.36374: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204028.36378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204028.36459: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204028.38200: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: 
Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204028.38249: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204028.38298: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpzwgid0j3 /root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/AnsiballZ_dnf.py <<< 9396 1727204028.38323: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/AnsiballZ_dnf.py" <<< 9396 1727204028.38367: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpzwgid0j3" to remote "/root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/AnsiballZ_dnf.py" <<< 9396 1727204028.38371: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/AnsiballZ_dnf.py" <<< 9396 1727204028.39684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204028.39732: stderr chunk (state=3): >>><<< 9396 1727204028.39747: stdout chunk (state=3): >>><<< 9396 1727204028.39810: done transferring module to remote 9396 1727204028.39814: _low_level_execute_command(): starting 9396 1727204028.39817: _low_level_execute_command(): executing: /bin/sh 
-c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/ /root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/AnsiballZ_dnf.py && sleep 0' 9396 1727204028.40463: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204028.40468: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204028.40516: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204028.40591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204028.42517: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204028.42563: stderr chunk (state=3): >>><<< 9396 1727204028.42568: stdout chunk (state=3): >>><<< 9396 1727204028.42610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204028.42614: _low_level_execute_command(): starting 9396 1727204028.42624: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/AnsiballZ_dnf.py && sleep 0' 9396 1727204028.43173: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204028.43236: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204028.43239: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204028.43283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204031.27750: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-1.fc39.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 9396 1727204031.33970: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204031.34057: stderr chunk (state=3): >>><<< 9396 1727204031.34276: stdout chunk (state=3): >>><<< 9396 1727204031.34279: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-1.fc39.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204031.34287: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204031.34291: _low_level_execute_command(): starting 9396 1727204031.34294: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204028.1011796-10071-37845794627669/ > /dev/null 2>&1 && sleep 0' 9396 1727204031.35560: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204031.35613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204031.35631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204031.35657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204031.35811: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204031.35876: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204031.36017: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204031.36050: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204031.38242: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204031.38260: stdout chunk (state=3): >>><<< 9396 1727204031.38274: stderr chunk (state=3): >>><<< 9396 1727204031.38472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204031.38475: handler run complete 9396 1727204031.38855: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204031.39305: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204031.39549: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204031.39596: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204031.39639: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204031.39744: variable '__install_status' from source: unknown 9396 1727204031.39905: Evaluated conditional (__install_status is success): True 9396 1727204031.39936: attempt loop complete, returning result 9396 1727204031.39994: _execute() done 9396 1727204031.40004: dumping result to json 9396 1727204031.40020: done dumping result, returning 9396 1727204031.40047: done running TaskExecutor() for managed-node1/TASK: Install dnsmasq [12b410aa-8751-36c5-1f9e-00000000000f] 9396 1727204031.40105: sending task result for task 12b410aa-8751-36c5-1f9e-00000000000f changed: [managed-node1] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-1.fc39.x86_64" ] } 9396 1727204031.40500: no more pending results, returning what we have 9396 1727204031.40505: results queue empty 9396 1727204031.40506: checking for any_errors_fatal 9396 1727204031.40514: done checking for any_errors_fatal 9396 1727204031.40515: checking for max_fail_percentage 9396 1727204031.40516: done checking for max_fail_percentage 9396 1727204031.40517: checking to see if all hosts have failed and the running result is not ok 9396 
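The "Install dnsmasq" result above (changed, rc=0, attempts=1, with `__install_status is success` evaluated afterwards) is consistent with a task of roughly this shape. This is a hedged reconstruction from the `module_args` and conditional evaluations recorded in the log — only `name: dnsmasq` and `state: present` appear in the log; the retry parameters and task layout are assumptions, since the playbook source itself is not shown:

```yaml
# Hypothetical reconstruction of the task whose result appears above.
# name/state come from the logged module_args; the until/retries loop is
# inferred from "attempts: 1" and the __install_status conditional in the log.
- name: Install dnsmasq
  ansible.builtin.dnf:
    name: dnsmasq
    state: present
  register: __install_status
  until: __install_status is success
  retries: 3    # assumed value; not recorded in the log
  delay: 5      # assumed value; not recorded in the log
```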
1727204031.40518: done checking to see if all hosts have failed 9396 1727204031.40519: getting the remaining hosts for this loop 9396 1727204031.40520: done getting the remaining hosts for this loop 9396 1727204031.40525: getting the next task for host managed-node1 9396 1727204031.40532: done getting next task for host managed-node1 9396 1727204031.40535: ^ task is: TASK: Install pgrep, sysctl 9396 1727204031.40538: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204031.40541: getting variables 9396 1727204031.40543: in VariableManager get_vars() 9396 1727204031.40587: Calling all_inventory to load vars for managed-node1 9396 1727204031.40793: Calling groups_inventory to load vars for managed-node1 9396 1727204031.40798: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204031.40810: Calling all_plugins_play to load vars for managed-node1 9396 1727204031.40814: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204031.40820: Calling groups_plugins_play to load vars for managed-node1 9396 1727204031.41164: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000000f 9396 1727204031.41168: WORKER PROCESS EXITING 9396 1727204031.41186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204031.41773: done with get_vars() 9396 1727204031.41786: done getting variables 9396 1727204031.42183: Loading ActionModule 'package' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Tuesday 24 September 2024 14:53:51 -0400 (0:00:03.400) 0:00:07.393 ***** 9396 1727204031.42227: entering _queue_task() for managed-node1/package 9396 1727204031.43126: worker is 1 (out of 1 available) 9396 1727204031.43135: exiting _queue_task() for managed-node1/package 9396 1727204031.43149: done queuing things up, now waiting for results queue to drain 9396 1727204031.43150: waiting for pending results... 9396 1727204031.43328: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 9396 1727204031.43819: in run() - task 12b410aa-8751-36c5-1f9e-000000000010 9396 1727204031.43824: variable 'ansible_search_path' from source: unknown 9396 1727204031.43828: variable 'ansible_search_path' from source: unknown 9396 1727204031.43831: calling self._execute() 9396 1727204031.43973: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204031.43996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204031.44017: variable 'omit' from source: magic vars 9396 1727204031.44504: variable 'ansible_distribution_major_version' from source: facts 9396 1727204031.44526: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204031.44701: variable 'ansible_os_family' from source: facts 9396 1727204031.44717: Evaluated conditional (ansible_os_family == 'RedHat'): True 9396 1727204031.44958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 
1727204031.45371: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204031.45433: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204031.45484: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204031.45537: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204031.45645: variable 'ansible_distribution_major_version' from source: facts 9396 1727204031.45664: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 9396 1727204031.45673: when evaluation is False, skipping this task 9396 1727204031.45681: _execute() done 9396 1727204031.45701: dumping result to json 9396 1727204031.45797: done dumping result, returning 9396 1727204031.45802: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [12b410aa-8751-36c5-1f9e-000000000010] 9396 1727204031.45805: sending task result for task 12b410aa-8751-36c5-1f9e-000000000010 9396 1727204031.45882: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000010 9396 1727204031.45885: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 9396 1727204031.45958: no more pending results, returning what we have 9396 1727204031.45963: results queue empty 9396 1727204031.45965: checking for any_errors_fatal 9396 1727204031.45973: done checking for any_errors_fatal 9396 1727204031.45974: checking for max_fail_percentage 9396 1727204031.45976: done checking for max_fail_percentage 9396 1727204031.45977: checking to see if all hosts have failed and the running result is not ok 9396 1727204031.45978: done checking to see if all hosts have failed 9396 1727204031.45979: getting the remaining 
hosts for this loop 9396 1727204031.45981: done getting the remaining hosts for this loop 9396 1727204031.45986: getting the next task for host managed-node1 9396 1727204031.45996: done getting next task for host managed-node1 9396 1727204031.46191: ^ task is: TASK: Install pgrep, sysctl 9396 1727204031.46196: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204031.46200: getting variables 9396 1727204031.46202: in VariableManager get_vars() 9396 1727204031.46244: Calling all_inventory to load vars for managed-node1 9396 1727204031.46248: Calling groups_inventory to load vars for managed-node1 9396 1727204031.46251: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204031.46262: Calling all_plugins_play to load vars for managed-node1 9396 1727204031.46265: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204031.46269: Calling groups_plugins_play to load vars for managed-node1 9396 1727204031.46562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204031.46867: done with get_vars() 9396 1727204031.46882: done getting variables 9396 1727204031.46953: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
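The two "Install pgrep, sysctl" tasks above (task paths `:17` and `:26` in create_test_interfaces_with_dhcp.yml) form a mutually exclusive pair: the first was skipped because `ansible_distribution_major_version is version('6', '<=')` evaluated False, and the second ran because `version('7', '>=')` evaluated True. A sketch of that pattern, assuming package names — the log records only the task names and conditionals, not the package list:

```yaml
# Hypothetical sketch of the conditional task pair seen in the log.
# Package names below are assumptions for illustration only.
- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps        # assumed EL6-era package name
    state: present
  when: ansible_distribution_major_version is version('6', '<=')

- name: Install pgrep, sysctl
  ansible.builtin.package:
    name: procps-ng     # assumed EL7+/Fedora package name
    state: present
  when: ansible_distribution_major_version is version('7', '>=')
```

On the Fedora 39 node in this run, only the second task executes; the first reports `skip_reason: Conditional result was False`, exactly as shown in the log.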
(found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26 Tuesday 24 September 2024 14:53:51 -0400 (0:00:00.047) 0:00:07.440 ***** 9396 1727204031.46994: entering _queue_task() for managed-node1/package 9396 1727204031.47429: worker is 1 (out of 1 available) 9396 1727204031.47441: exiting _queue_task() for managed-node1/package 9396 1727204031.47455: done queuing things up, now waiting for results queue to drain 9396 1727204031.47457: waiting for pending results... 9396 1727204031.47767: running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl 9396 1727204031.47848: in run() - task 12b410aa-8751-36c5-1f9e-000000000011 9396 1727204031.47869: variable 'ansible_search_path' from source: unknown 9396 1727204031.47878: variable 'ansible_search_path' from source: unknown 9396 1727204031.47947: calling self._execute() 9396 1727204031.48042: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204031.48166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204031.48169: variable 'omit' from source: magic vars 9396 1727204031.48573: variable 'ansible_distribution_major_version' from source: facts 9396 1727204031.48596: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204031.48812: variable 'ansible_os_family' from source: facts 9396 1727204031.48834: Evaluated conditional (ansible_os_family == 'RedHat'): True 9396 1727204031.49066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204031.49528: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204031.49595: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204031.49641: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204031.49684: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204031.49781: variable 'ansible_distribution_major_version' from source: facts 9396 1727204031.49811: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 9396 1727204031.49923: variable 'omit' from source: magic vars 9396 1727204031.49926: variable 'omit' from source: magic vars 9396 1727204031.50126: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204031.54796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204031.54846: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204031.54887: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204031.55190: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204031.55196: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204031.55279: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204031.55319: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204031.55377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204031.55428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204031.55517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204031.55626: variable '__network_is_ostree' from source: set_fact 9396 1727204031.55629: variable 'omit' from source: magic vars 9396 1727204031.55635: variable 'omit' from source: magic vars 9396 1727204031.55667: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204031.55702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204031.55723: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204031.55826: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204031.55829: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204031.55842: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204031.55845: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204031.55851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204031.56087: Set connection var ansible_timeout to 10 9396 1727204031.56223: Set connection var ansible_shell_executable to /bin/sh 9396 1727204031.56236: Set connection var ansible_pipelining to False 9396 1727204031.56244: Set connection var 
ansible_module_compression to ZIP_DEFLATED 9396 1727204031.56251: Set connection var ansible_connection to ssh 9396 1727204031.56254: Set connection var ansible_shell_type to sh 9396 1727204031.56290: variable 'ansible_shell_executable' from source: unknown 9396 1727204031.56294: variable 'ansible_connection' from source: unknown 9396 1727204031.56297: variable 'ansible_module_compression' from source: unknown 9396 1727204031.56315: variable 'ansible_shell_type' from source: unknown 9396 1727204031.56318: variable 'ansible_shell_executable' from source: unknown 9396 1727204031.56320: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204031.56322: variable 'ansible_pipelining' from source: unknown 9396 1727204031.56325: variable 'ansible_timeout' from source: unknown 9396 1727204031.56399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204031.56721: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204031.56725: variable 'omit' from source: magic vars 9396 1727204031.56728: starting attempt loop 9396 1727204031.56732: running the handler 9396 1727204031.56734: variable 'ansible_facts' from source: unknown 9396 1727204031.56737: variable 'ansible_facts' from source: unknown 9396 1727204031.56740: _low_level_execute_command(): starting 9396 1727204031.56742: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204031.58440: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204031.58630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204031.58634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204031.58661: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204031.58805: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204031.60485: stdout chunk (state=3): >>>/root <<< 9396 1727204031.60896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204031.60899: stderr chunk (state=3): >>><<< 9396 1727204031.60902: stdout chunk (state=3): >>><<< 9396 1727204031.60921: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204031.60937: _low_level_execute_command(): starting 9396 1727204031.60944: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185 `" && echo ansible-tmp-1727204031.6092248-10347-243091782061185="` echo /root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185 `" ) && sleep 0' 9396 1727204031.62126: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204031.62206: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204031.62634: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204031.62736: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204031.64832: stdout chunk (state=3): >>>ansible-tmp-1727204031.6092248-10347-243091782061185=/root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185 <<< 9396 1727204031.64946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204031.65094: stderr chunk (state=3): >>><<< 9396 1727204031.65097: stdout chunk (state=3): >>><<< 9396 1727204031.65186: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204031.6092248-10347-243091782061185=/root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204031.65228: variable 'ansible_module_compression' from source: unknown 9396 1727204031.65285: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 9396 1727204031.65444: variable 'ansible_facts' from source: unknown 9396 1727204031.65664: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/AnsiballZ_dnf.py 9396 1727204031.66017: Sending initial data 9396 1727204031.66020: Sent initial data (151 bytes) 9396 1727204031.67464: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204031.67471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204031.67605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204031.67694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204031.67703: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204031.67795: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 
1727204031.67830: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204031.69619: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204031.69658: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpg5n73n09 /root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/AnsiballZ_dnf.py <<< 9396 1727204031.69662: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/AnsiballZ_dnf.py" <<< 9396 1727204031.69922: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpg5n73n09" to remote "/root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/AnsiballZ_dnf.py" <<< 9396 1727204031.72402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204031.72430: stderr chunk (state=3): >>><<< 9396 1727204031.72439: stdout chunk (state=3): >>><<< 9396 1727204031.72459: done 
transferring module to remote 9396 1727204031.72472: _low_level_execute_command(): starting 9396 1727204031.72478: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/ /root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/AnsiballZ_dnf.py && sleep 0' 9396 1727204031.74089: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204031.74095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204031.74098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204031.74101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204031.74346: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204031.76395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204031.76399: stdout chunk (state=3): >>><<< 9396 1727204031.76414: stderr chunk (state=3): >>><<< 9396 1727204031.76432: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204031.76436: _low_level_execute_command(): starting 9396 1727204031.76444: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/AnsiballZ_dnf.py && sleep 0' 9396 1727204031.77756: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204031.77985: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204031.78130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204031.78177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204033.31199: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 9396 1727204033.36020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204033.36047: stderr chunk (state=3): >>><<< 9396 1727204033.36057: stdout chunk (state=3): >>><<< 9396 1727204033.36203: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204033.36213: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204033.36216: _low_level_execute_command(): starting 9396 1727204033.36218: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204031.6092248-10347-243091782061185/ > /dev/null 2>&1 && sleep 0' 9396 1727204033.36920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204033.37007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 9396 1727204033.37023: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204033.37042: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204033.37119: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204033.39069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204033.39183: stderr chunk (state=3): >>><<< 9396 1727204033.39187: stdout chunk (state=3): >>><<< 9396 1727204033.39396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 
1727204033.39400: handler run complete
9396 1727204033.39402: attempt loop complete, returning result
9396 1727204033.39405: _execute() done
9396 1727204033.39407: dumping result to json
9396 1727204033.39409: done dumping result, returning
9396 1727204033.39411: done running TaskExecutor() for managed-node1/TASK: Install pgrep, sysctl [12b410aa-8751-36c5-1f9e-000000000011]
9396 1727204033.39414: sending task result for task 12b410aa-8751-36c5-1f9e-000000000011
ok: [managed-node1] => {
    "changed": false,
    "rc": 0,
    "results": []
}

MSG:

Nothing to do
9396 1727204033.39672: no more pending results, returning what we have
9396 1727204033.39677: results queue empty
9396 1727204033.39678: checking for any_errors_fatal
9396 1727204033.39687: done checking for any_errors_fatal
9396 1727204033.39859: checking for max_fail_percentage
9396 1727204033.39863: done checking for max_fail_percentage
9396 1727204033.39864: checking to see if all hosts have failed and the running result is not ok
9396 1727204033.39865: done checking to see if all hosts have failed
9396 1727204033.39866: getting the remaining hosts for this loop
9396 1727204033.39868: done getting the remaining hosts for this loop
9396 1727204033.39900: getting the next task for host managed-node1
9396 1727204033.39908: done getting next task for host managed-node1
9396 1727204033.39912: ^ task is: TASK: Create test interfaces
9396 1727204033.39917: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204033.40035: getting variables
9396 1727204033.40037: in VariableManager get_vars()
9396 1727204033.40435: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000011
9396 1727204033.40442: WORKER PROCESS EXITING
9396 1727204033.40454: Calling all_inventory to load vars for managed-node1
9396 1727204033.40457: Calling groups_inventory to load vars for managed-node1
9396 1727204033.40462: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204033.40475: Calling all_plugins_play to load vars for managed-node1
9396 1727204033.40479: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204033.40483: Calling groups_plugins_play to load vars for managed-node1
9396 1727204033.41238: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204033.41636: done with get_vars()
9396 1727204033.41650: done getting variables
9396 1727204033.41780: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Create test interfaces] **************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Tuesday 24 September 2024  14:53:53 -0400 (0:00:01.948)       0:00:09.389 *****
9396 1727204033.41822: entering _queue_task() for managed-node1/shell
9396 1727204033.41824: Creating lock for shell
9396 1727204033.42162: worker is 1 (out of 1 available)
9396 1727204033.42177: exiting _queue_task() for managed-node1/shell
9396 1727204033.42397: done queuing things up, now waiting for results queue to drain
9396 1727204033.42399: waiting for pending results...
9396 1727204033.42522: running TaskExecutor() for managed-node1/TASK: Create test interfaces 9396 1727204033.42726: in run() - task 12b410aa-8751-36c5-1f9e-000000000012 9396 1727204033.42731: variable 'ansible_search_path' from source: unknown 9396 1727204033.42734: variable 'ansible_search_path' from source: unknown 9396 1727204033.42738: calling self._execute() 9396 1727204033.42820: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204033.42843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204033.42944: variable 'omit' from source: magic vars 9396 1727204033.43350: variable 'ansible_distribution_major_version' from source: facts 9396 1727204033.43380: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204033.43402: variable 'omit' from source: magic vars 9396 1727204033.43468: variable 'omit' from source: magic vars 9396 1727204033.44476: variable 'dhcp_interface1' from source: play vars 9396 1727204033.44480: variable 'dhcp_interface2' from source: play vars 9396 1727204033.44483: variable 'omit' from source: magic vars 9396 1727204033.44580: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204033.44693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204033.44761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204033.44828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204033.45029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204033.45033: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204033.45036: variable 'ansible_host' from source: host vars for 
'managed-node1' 9396 1727204033.45039: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204033.45291: Set connection var ansible_timeout to 10 9396 1727204033.45599: Set connection var ansible_shell_executable to /bin/sh 9396 1727204033.45604: Set connection var ansible_pipelining to False 9396 1727204033.45606: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204033.45609: Set connection var ansible_connection to ssh 9396 1727204033.45611: Set connection var ansible_shell_type to sh 9396 1727204033.45613: variable 'ansible_shell_executable' from source: unknown 9396 1727204033.45615: variable 'ansible_connection' from source: unknown 9396 1727204033.45618: variable 'ansible_module_compression' from source: unknown 9396 1727204033.45620: variable 'ansible_shell_type' from source: unknown 9396 1727204033.45622: variable 'ansible_shell_executable' from source: unknown 9396 1727204033.45624: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204033.45626: variable 'ansible_pipelining' from source: unknown 9396 1727204033.45628: variable 'ansible_timeout' from source: unknown 9396 1727204033.45630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204033.45921: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204033.46040: variable 'omit' from source: magic vars 9396 1727204033.46043: starting attempt loop 9396 1727204033.46045: running the handler 9396 1727204033.46048: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204033.46258: _low_level_execute_command(): starting 9396 1727204033.46261: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204033.47703: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204033.47919: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204033.48174: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204033.48217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204033.49960: stdout chunk (state=3): >>>/root <<< 9396 1727204033.50120: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204033.50169: stderr chunk (state=3): >>><<< 9396 1727204033.50194: stdout chunk (state=3): >>><<< 9396 1727204033.50229: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204033.50259: _low_level_execute_command(): starting 9396 1727204033.50272: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342 `" && echo ansible-tmp-1727204033.5023575-10491-18778222470342="` echo /root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342 `" ) && sleep 0' 9396 1727204033.50903: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204033.50924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204033.50951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204033.50981: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204033.51009: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204033.51061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204033.51125: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204033.51146: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204033.51178: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204033.51458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204033.53548: stdout chunk (state=3): >>>ansible-tmp-1727204033.5023575-10491-18778222470342=/root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342 <<< 9396 1727204033.53746: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204033.53757: stdout chunk (state=3): >>><<< 9396 1727204033.53782: stderr chunk (state=3): >>><<< 9396 1727204033.53819: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204033.5023575-10491-18778222470342=/root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204033.53949: variable 'ansible_module_compression' from source: unknown 9396 1727204033.53953: ANSIBALLZ: Using generic lock for ansible.legacy.command 9396 1727204033.53956: ANSIBALLZ: Acquiring lock 9396 1727204033.53967: ANSIBALLZ: Lock acquired: 139797141880704 9396 1727204033.53978: ANSIBALLZ: Creating module 9396 1727204033.79269: ANSIBALLZ: Writing module into payload 9396 1727204033.79399: ANSIBALLZ: Writing module 9396 1727204033.79440: ANSIBALLZ: Renaming module 9396 1727204033.79451: ANSIBALLZ: Done creating module 9396 1727204033.79472: variable 'ansible_facts' from source: unknown 9396 1727204033.79558: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/AnsiballZ_command.py 9396 1727204033.79796: Sending initial data 9396 1727204033.79865: Sent initial data (154 bytes) 9396 1727204033.80462: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204033.80481: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config <<< 9396 1727204033.80524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204033.80632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204033.80655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204033.80672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204033.80695: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204033.80858: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204033.82596: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 9396 1727204033.82616: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 9396 1727204033.82646: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204033.82688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204033.82764: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/AnsiballZ_command.py" <<< 9396 1727204033.82768: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpn1jb8ele /root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/AnsiballZ_command.py <<< 9396 1727204033.82823: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpn1jb8ele" to remote "/root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/AnsiballZ_command.py" <<< 9396 1727204033.85194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204033.85200: stdout chunk (state=3): >>><<< 9396 1727204033.85202: stderr chunk (state=3): >>><<< 9396 1727204033.85205: done transferring module to remote 9396 1727204033.85209: _low_level_execute_command(): starting 9396 1727204033.85212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/ /root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/AnsiballZ_command.py && sleep 0' 9396 1727204033.86211: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204033.86311: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204033.86503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204033.86584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204033.86756: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204033.88802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204033.88806: stdout chunk (state=3): >>><<< 9396 1727204033.88808: stderr chunk (state=3): >>><<< 9396 1727204033.88812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204033.88815: _low_level_execute_command(): starting 9396 1727204033.88817: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/AnsiballZ_command.py && sleep 0' 9396 1727204033.89338: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204033.89355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204033.89382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204033.89424: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 9396 1727204033.89431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204033.89438: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204033.89515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204035.34295: stdout chunk (state=3): >>> <<< 9396 1727204035.34303: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 651 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 651 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed 
false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:53:54.071384", "end": "2024-09-24 14:53:55.338533", "delta": "0:00:01.267149", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204035.35729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204035.35733: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204035.36028: stderr chunk (state=3): >>><<< 9396 1727204035.36032: stdout chunk (state=3): >>><<< 9396 1727204035.36069: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 651 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 651 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-24 14:53:54.071384", "end": "2024-09-24 14:53:55.338533", "delta": "0:00:01.267149", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
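The module result above embeds a bash provisioning script whose `while ! ip addr show testbr | grep -q 'inet [1-9]'` loop works around an address-assignment race in NetworkManager (tracked at https://bugzilla.redhat.com/show_bug.cgi?id=2079642) by retrying `ip addr add` with a 30-iteration timeout. A minimal, unprivileged sketch of that retry-with-timeout pattern, with the real `ip addr add ... dev testbr` step replaced by a hypothetical simulated flaky step (`try_step`, fails twice then succeeds) so the shape of the loop can be run without root:

```shell
#!/bin/sh
# Sketch of the retry pattern from the log's provisioning script:
# retry a flaky step until it succeeds, or give up after 30 tries.
set -eu

attempts=0
timer=0

try_step() {
    # Simulated flaky operation (hypothetical stand-in for
    # `ip addr add 192.0.2.1/24 dev testbr`): fails twice, then succeeds.
    attempts=$((attempts + 1))
    [ "$attempts" -ge 3 ]
}

while ! try_step; do
    timer=$((timer + 1))
    if [ "$timer" -eq 30 ]; then
        # Mirrors the script's "ERROR - could not add testbr" bail-out.
        echo "ERROR - step never succeeded" >&2
        exit 1
    fi
done
echo "succeeded after $attempts attempts"
```

The real script additionally sleeps one second per iteration and re-checks the bridge with `grep -q 'inet [1-9]'`; the sketch keeps only the counter-plus-timeout skeleton.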
9396 1727204035.36143: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204035.36154: _low_level_execute_command(): starting 9396 1727204035.36179: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204033.5023575-10491-18778222470342/ > /dev/null 2>&1 && sleep 0' 9396 1727204035.37424: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204035.37496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204035.37500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204035.37502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.37505: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204035.37507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204035.37509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.37655: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204035.37679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204035.37864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204035.39933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204035.39937: stderr chunk (state=3): >>><<< 9396 1727204035.39944: stdout chunk (state=3): >>><<< 9396 1727204035.40114: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204035.40123: handler run complete 9396 1727204035.40155: Evaluated conditional (False): False 9396 1727204035.40172: attempt loop complete, returning result 9396 1727204035.40175: _execute() done 9396 1727204035.40178: dumping result to json 9396 1727204035.40224: done dumping result, returning 9396 1727204035.40228: done running TaskExecutor() for managed-node1/TASK: Create test interfaces [12b410aa-8751-36c5-1f9e-000000000012] 9396 1727204035.40231: sending task result for task 12b410aa-8751-36c5-1f9e-000000000012 9396 1727204035.40516: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000012 9396 1727204035.40519: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.267149", "end": "2024-09-24 14:53:55.338533", "rc": 0, "start": "2024-09-24 14:53:54.071384" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 651 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 651 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 9396 1727204035.40615: no more pending results, returning what we have 9396 1727204035.40796: results queue empty 9396 1727204035.40798: checking for any_errors_fatal 9396 1727204035.40807: done checking for any_errors_fatal 9396 1727204035.40808: checking for max_fail_percentage 9396 1727204035.40811: done checking for max_fail_percentage 9396 1727204035.40812: checking to see if all hosts have failed and the 
running result is not ok 9396 1727204035.40813: done checking to see if all hosts have failed 9396 1727204035.40814: getting the remaining hosts for this loop 9396 1727204035.40815: done getting the remaining hosts for this loop 9396 1727204035.40820: getting the next task for host managed-node1 9396 1727204035.40829: done getting next task for host managed-node1 9396 1727204035.40833: ^ task is: TASK: Include the task 'get_interface_stat.yml' 9396 1727204035.40836: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204035.40841: getting variables 9396 1727204035.40842: in VariableManager get_vars() 9396 1727204035.40885: Calling all_inventory to load vars for managed-node1 9396 1727204035.40888: Calling groups_inventory to load vars for managed-node1 9396 1727204035.40900: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204035.40912: Calling all_plugins_play to load vars for managed-node1 9396 1727204035.40915: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204035.40919: Calling groups_plugins_play to load vars for managed-node1 9396 1727204035.41358: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204035.41958: done with get_vars() 9396 1727204035.41972: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:53:55 -0400 (0:00:02.003) 0:00:11.392 ***** 9396 1727204035.42198: entering _queue_task() for managed-node1/include_tasks 9396 1727204035.42934: worker is 1 (out of 1 available) 9396 1727204035.42948: exiting _queue_task() for managed-node1/include_tasks 9396 1727204035.43001: done queuing things up, now waiting for results queue to drain 9396 1727204035.43003: waiting for pending results... 
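Editor's note: the "Create test interfaces" task recorded above loops until `testbr` reports an IPv4 address, retrying the `ip addr add` calls up to 30 times with a one-second sleep (the rhbz#2079642 workaround). The retry pattern can be sketched as a generic helper; `retry_until` is a hypothetical name, not part of the role, and the sleep is shortened here.

```shell
#!/bin/sh
# Sketch of the retry loop in the recorded script: run a probe command until
# it succeeds, bailing out after a fixed number of attempts.
# retry_until <max_attempts> <command> [args...]   (hypothetical helper)
retry_until() {
    max="$1"; shift
    timer=0
    until "$@"; do
        timer=$((timer + 1))
        if [ "$timer" -eq "$max" ]; then
            echo "ERROR - command did not succeed after $max attempts" >&2
            return 1
        fi
        sleep 0   # the recorded script sleeps 1s between attempts
    done
    return 0
}
```

In the recorded task the probe is `ip addr show testbr | grep -q 'inet [1-9]'` and the loop body re-runs the address assignments on each failure.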
9396 1727204035.43510: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 9396 1727204035.43795: in run() - task 12b410aa-8751-36c5-1f9e-000000000016 9396 1727204035.43995: variable 'ansible_search_path' from source: unknown 9396 1727204035.44000: variable 'ansible_search_path' from source: unknown 9396 1727204035.44003: calling self._execute() 9396 1727204035.44007: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204035.44012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204035.44015: variable 'omit' from source: magic vars 9396 1727204035.45100: variable 'ansible_distribution_major_version' from source: facts 9396 1727204035.45124: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204035.45137: _execute() done 9396 1727204035.45148: dumping result to json 9396 1727204035.45158: done dumping result, returning 9396 1727204035.45170: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-36c5-1f9e-000000000016] 9396 1727204035.45183: sending task result for task 12b410aa-8751-36c5-1f9e-000000000016 9396 1727204035.45324: no more pending results, returning what we have 9396 1727204035.45331: in VariableManager get_vars() 9396 1727204035.45488: Calling all_inventory to load vars for managed-node1 9396 1727204035.45492: Calling groups_inventory to load vars for managed-node1 9396 1727204035.45498: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204035.45515: Calling all_plugins_play to load vars for managed-node1 9396 1727204035.45519: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204035.45524: Calling groups_plugins_play to load vars for managed-node1 9396 1727204035.45996: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000016 9396 1727204035.46000: WORKER PROCESS EXITING 9396 1727204035.46144: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204035.46701: done with get_vars() 9396 1727204035.46710: variable 'ansible_search_path' from source: unknown 9396 1727204035.46712: variable 'ansible_search_path' from source: unknown 9396 1727204035.46759: we have included files to process 9396 1727204035.46760: generating all_blocks data 9396 1727204035.46762: done generating all_blocks data 9396 1727204035.46763: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204035.46764: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204035.46767: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204035.47328: done processing included file 9396 1727204035.47331: iterating over new_blocks loaded from include file 9396 1727204035.47333: in VariableManager get_vars() 9396 1727204035.47478: done with get_vars() 9396 1727204035.47480: filtering new block on tags 9396 1727204035.47503: done filtering new block on tags 9396 1727204035.47506: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 9396 1727204035.47512: extending task lists for all hosts with included blocks 9396 1727204035.47765: done extending task lists 9396 1727204035.47766: done processing included files 9396 1727204035.47767: results queue empty 9396 1727204035.47768: checking for any_errors_fatal 9396 1727204035.47775: done checking for any_errors_fatal 9396 1727204035.47776: checking for max_fail_percentage 9396 1727204035.47777: done checking for max_fail_percentage 9396 1727204035.47778: 
checking to see if all hosts have failed and the running result is not ok 9396 1727204035.47779: done checking to see if all hosts have failed 9396 1727204035.47780: getting the remaining hosts for this loop 9396 1727204035.47781: done getting the remaining hosts for this loop 9396 1727204035.47896: getting the next task for host managed-node1 9396 1727204035.47903: done getting next task for host managed-node1 9396 1727204035.47905: ^ task is: TASK: Get stat for interface {{ interface }} 9396 1727204035.47909: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204035.47912: getting variables 9396 1727204035.47913: in VariableManager get_vars() 9396 1727204035.47929: Calling all_inventory to load vars for managed-node1 9396 1727204035.47931: Calling groups_inventory to load vars for managed-node1 9396 1727204035.47934: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204035.47941: Calling all_plugins_play to load vars for managed-node1 9396 1727204035.47944: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204035.47948: Calling groups_plugins_play to load vars for managed-node1 9396 1727204035.48386: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204035.49079: done with get_vars() 9396 1727204035.49217: done getting variables 9396 1727204035.49493: variable 'interface' from source: task vars 9396 1727204035.49499: variable 'dhcp_interface1' from source: play vars 9396 1727204035.49742: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:53:55 -0400 (0:00:00.076) 0:00:11.469 ***** 9396 1727204035.49833: entering _queue_task() for managed-node1/stat 9396 1727204035.50542: worker is 1 (out of 1 available) 9396 1727204035.50554: exiting _queue_task() for managed-node1/stat 9396 1727204035.50568: done queuing things up, now waiting for results queue to drain 9396 1727204035.50570: waiting for pending results... 
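Editor's note: the stat task queued above (`get_interface_stat.yml`) checks for `/sys/class/net/test1`, which the kernel exposes as a symlink into `/sys/devices/...` whenever the interface exists. A minimal local equivalent of that presence check might look like the following; `SYSFS_NET` is parameterized here only so the check can be exercised against a fake tree — the real path is `/sys/class/net`.

```shell
#!/bin/sh
# Sketch of the interface-presence check performed by the stat module call:
# an interface named <iface> exists iff /sys/class/net/<iface> is present
# (normally a symlink, e.g. test1 -> ../../devices/virtual/net/test1).
SYSFS_NET="${SYSFS_NET:-/sys/class/net}"
interface_present() {
    # -e follows the symlink; -L also accepts a dangling link, matching
    # stat's follow=false behavior of reporting the link itself.
    [ -e "$SYSFS_NET/$1" ] || [ -L "$SYSFS_NET/$1" ]
}
```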
9396 1727204035.51081: running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 9396 1727204035.51421: in run() - task 12b410aa-8751-36c5-1f9e-000000000153 9396 1727204035.51432: variable 'ansible_search_path' from source: unknown 9396 1727204035.51436: variable 'ansible_search_path' from source: unknown 9396 1727204035.51476: calling self._execute() 9396 1727204035.51572: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204035.51579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204035.51592: variable 'omit' from source: magic vars 9396 1727204035.52610: variable 'ansible_distribution_major_version' from source: facts 9396 1727204035.52680: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204035.52684: variable 'omit' from source: magic vars 9396 1727204035.52699: variable 'omit' from source: magic vars 9396 1727204035.53022: variable 'interface' from source: task vars 9396 1727204035.53026: variable 'dhcp_interface1' from source: play vars 9396 1727204035.53104: variable 'dhcp_interface1' from source: play vars 9396 1727204035.53131: variable 'omit' from source: magic vars 9396 1727204035.53177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204035.53500: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204035.53504: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204035.53507: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204035.53509: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204035.53531: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 
1727204035.53534: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204035.53537: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204035.53896: Set connection var ansible_timeout to 10 9396 1727204035.53900: Set connection var ansible_shell_executable to /bin/sh 9396 1727204035.53902: Set connection var ansible_pipelining to False 9396 1727204035.53904: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204035.53907: Set connection var ansible_connection to ssh 9396 1727204035.53909: Set connection var ansible_shell_type to sh 9396 1727204035.53937: variable 'ansible_shell_executable' from source: unknown 9396 1727204035.53942: variable 'ansible_connection' from source: unknown 9396 1727204035.53944: variable 'ansible_module_compression' from source: unknown 9396 1727204035.53947: variable 'ansible_shell_type' from source: unknown 9396 1727204035.53949: variable 'ansible_shell_executable' from source: unknown 9396 1727204035.53954: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204035.53959: variable 'ansible_pipelining' from source: unknown 9396 1727204035.53962: variable 'ansible_timeout' from source: unknown 9396 1727204035.53969: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204035.54479: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204035.54488: variable 'omit' from source: magic vars 9396 1727204035.54498: starting attempt loop 9396 1727204035.54501: running the handler 9396 1727204035.54523: _low_level_execute_command(): starting 9396 1727204035.54580: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204035.55998: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204035.56008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204035.56204: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.56208: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204035.56219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.56231: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204035.56234: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204035.56255: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204035.56416: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204035.58071: stdout chunk (state=3): >>>/root <<< 9396 1727204035.58332: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204035.58495: stderr chunk (state=3): >>><<< 9396 1727204035.58498: stdout chunk (state=3): >>><<< 9396 1727204035.58618: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204035.58633: _low_level_execute_command(): starting 9396 1727204035.58640: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968 `" && echo ansible-tmp-1727204035.5861795-10692-55415662641968="` echo /root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968 `" ) && sleep 0' 9396 1727204035.59820: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204035.60080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204035.60099: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.60102: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204035.60105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.60107: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204035.60109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204035.60143: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204035.60181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204035.62381: stdout chunk (state=3): >>>ansible-tmp-1727204035.5861795-10692-55415662641968=/root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968 <<< 9396 1727204035.62385: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204035.62394: stdout chunk (state=3): >>><<< 9396 1727204035.62397: stderr chunk (state=3): >>><<< 9396 1727204035.62419: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204035.5861795-10692-55415662641968=/root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204035.62475: variable 'ansible_module_compression' from source: unknown 9396 1727204035.62581: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9396 1727204035.62848: variable 'ansible_facts' from source: unknown 9396 1727204035.62946: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/AnsiballZ_stat.py 9396 1727204035.63655: Sending initial data 9396 1727204035.63658: Sent initial data (151 bytes) 9396 1727204035.65063: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.65155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204035.65164: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204035.65408: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204035.65625: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204035.67527: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204035.67549: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204035.67605: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp6ok1u_hs /root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/AnsiballZ_stat.py <<< 9396 1727204035.67614: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/AnsiballZ_stat.py" <<< 9396 1727204035.68206: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp6ok1u_hs" to remote "/root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/AnsiballZ_stat.py" <<< 9396 1727204035.70888: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204035.71314: stderr chunk (state=3): >>><<< 9396 1727204035.71321: stdout chunk (state=3): >>><<< 9396 1727204035.71396: done transferring module to remote 9396 1727204035.71400: _low_level_execute_command(): starting 9396 1727204035.71403: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/ /root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/AnsiballZ_stat.py && sleep 0' 9396 1727204035.72771: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204035.72774: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.72801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204035.72916: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204035.72950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204035.74948: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204035.75155: stderr chunk (state=3): >>><<< 9396 1727204035.75159: stdout chunk (state=3): >>><<< 9396 1727204035.75162: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204035.75164: _low_level_execute_command(): starting 9396 1727204035.75168: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/AnsiballZ_stat.py && sleep 0' 9396 1727204035.76425: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204035.76440: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204035.76457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204035.76654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.76716: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204035.76836: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204035.76916: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204035.94976: stdout chunk (state=3): >>> {"changed": false, "stat": 
{"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34023, "dev": 23, "nlink": 1, "atime": 1727204034.0792925, "mtime": 1727204034.0792925, "ctime": 1727204034.0792925, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 9396 1727204035.96650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204035.96654: stdout chunk (state=3): >>><<< 9396 1727204035.96656: stderr chunk (state=3): >>><<< 9396 1727204035.96675: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34023, "dev": 23, "nlink": 1, "atime": 1727204034.0792925, "mtime": 1727204034.0792925, "ctime": 1727204034.0792925, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration 
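The stat result above is how the test decides that the interface is present: `/sys/class/net/test1` exists as a symlink into `/sys/devices/virtual/net/`, and the task runs with `follow: false`, so the link itself is examined. A minimal Python sketch of the same check (the function name and the configurable sysfs root are illustrative, not part of Ansible):

```python
import os
import tempfile

def interface_present(name: str, sysfs_root: str = "/sys/class/net") -> bool:
    """A network interface counts as present when /sys/class/net/<name>
    exists; on Linux that entry is a symlink into /sys/devices/...  Uses
    an lstat-style test (follow=False in the stat task), so a dangling
    link still counts."""
    return os.path.lexists(os.path.join(sysfs_root, name))

# Demonstrate against a throwaway sysfs-like tree instead of the real /sys
with tempfile.TemporaryDirectory() as root:
    os.symlink("../../devices/virtual/net/test1", os.path.join(root, "test1"))
    print(interface_present("test1", root))   # True
    print(interface_present("test9", root))   # False
```

`os.path.lexists` mirrors the `islnk: true` case in the log: it returns True for the symlink even though its target is outside the temporary tree.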
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204035.96848: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204035.96867: _low_level_execute_command(): starting 9396 1727204035.96960: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204035.5861795-10692-55415662641968/ > /dev/null 2>&1 && sleep 0' 9396 1727204035.98291: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204035.98300: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.98303: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204035.98356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204035.98546: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204035.98574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204036.00809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204036.00813: stdout chunk (state=3): >>><<< 9396 1727204036.00816: stderr chunk (state=3): >>><<< 9396 1727204036.00818: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204036.00821: handler run complete 9396 1727204036.01097: attempt loop complete, returning result 9396 1727204036.01101: _execute() done 9396 1727204036.01103: dumping result to json 9396 1727204036.01106: done dumping result, returning 9396 1727204036.01108: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test1 [12b410aa-8751-36c5-1f9e-000000000153] 9396 1727204036.01110: sending task result for task 12b410aa-8751-36c5-1f9e-000000000153 ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204034.0792925, "block_size": 4096, "blocks": 0, "ctime": 1727204034.0792925, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34023, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1727204034.0792925, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 9396 1727204036.01612: no more pending results, returning what we have 9396 1727204036.01617: results queue empty 9396 1727204036.01619: checking for any_errors_fatal 9396 1727204036.01620: done checking for any_errors_fatal 9396 1727204036.01621: checking for max_fail_percentage 9396 1727204036.01623: done checking for max_fail_percentage 9396 1727204036.01624: checking to see if all hosts have failed and the running result is not ok 9396 
1727204036.01625: done checking to see if all hosts have failed 9396 1727204036.01626: getting the remaining hosts for this loop 9396 1727204036.01628: done getting the remaining hosts for this loop 9396 1727204036.01634: getting the next task for host managed-node1 9396 1727204036.01643: done getting next task for host managed-node1 9396 1727204036.01646: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 9396 1727204036.01650: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204036.01655: getting variables 9396 1727204036.01657: in VariableManager get_vars() 9396 1727204036.02246: Calling all_inventory to load vars for managed-node1 9396 1727204036.02250: Calling groups_inventory to load vars for managed-node1 9396 1727204036.02253: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204036.02266: Calling all_plugins_play to load vars for managed-node1 9396 1727204036.02269: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204036.02274: Calling groups_plugins_play to load vars for managed-node1 9396 1727204036.03002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204036.03493: done with get_vars() 9396 1727204036.03510: done getting variables 9396 1727204036.03877: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 9396 1727204036.04025: variable 'interface' from source: task vars 9396 1727204036.04030: variable 'dhcp_interface1' from source: play vars 9396 1727204036.04104: variable 'dhcp_interface1' from source: play vars 9396 1727204036.04296: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000153 9396 1727204036.04300: WORKER PROCESS EXITING TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.544) 0:00:12.014 ***** 9396 1727204036.04318: entering _queue_task() for managed-node1/assert 9396 1727204036.04321: Creating lock for assert 9396 1727204036.05088: worker is 1 (out of 1 available) 9396 1727204036.05104: exiting _queue_task() 
for managed-node1/assert 9396 1727204036.05121: done queuing things up, now waiting for results queue to drain 9396 1727204036.05123: waiting for pending results... 9396 1727204036.05614: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' 9396 1727204036.05904: in run() - task 12b410aa-8751-36c5-1f9e-000000000017 9396 1727204036.05968: variable 'ansible_search_path' from source: unknown 9396 1727204036.06002: variable 'ansible_search_path' from source: unknown 9396 1727204036.06052: calling self._execute() 9396 1727204036.06391: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.06409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.06429: variable 'omit' from source: magic vars 9396 1727204036.07486: variable 'ansible_distribution_major_version' from source: facts 9396 1727204036.07511: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204036.07694: variable 'omit' from source: magic vars 9396 1727204036.07699: variable 'omit' from source: magic vars 9396 1727204036.07878: variable 'interface' from source: task vars 9396 1727204036.08133: variable 'dhcp_interface1' from source: play vars 9396 1727204036.08136: variable 'dhcp_interface1' from source: play vars 9396 1727204036.08202: variable 'omit' from source: magic vars 9396 1727204036.08260: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204036.08394: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204036.08425: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204036.08453: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204036.08677: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204036.08684: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204036.08687: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.08695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.09266: Set connection var ansible_timeout to 10 9396 1727204036.09340: Set connection var ansible_shell_executable to /bin/sh 9396 1727204036.09399: Set connection var ansible_pipelining to False 9396 1727204036.09503: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204036.09797: Set connection var ansible_connection to ssh 9396 1727204036.09800: Set connection var ansible_shell_type to sh 9396 1727204036.09803: variable 'ansible_shell_executable' from source: unknown 9396 1727204036.09805: variable 'ansible_connection' from source: unknown 9396 1727204036.09809: variable 'ansible_module_compression' from source: unknown 9396 1727204036.09812: variable 'ansible_shell_type' from source: unknown 9396 1727204036.09814: variable 'ansible_shell_executable' from source: unknown 9396 1727204036.09816: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.09819: variable 'ansible_pipelining' from source: unknown 9396 1727204036.09821: variable 'ansible_timeout' from source: unknown 9396 1727204036.09823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.10254: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204036.10344: variable 'omit' from source: magic vars 9396 1727204036.10362: starting attempt loop 9396 
1727204036.10468: running the handler 9396 1727204036.11128: variable 'interface_stat' from source: set_fact 9396 1727204036.11219: Evaluated conditional (interface_stat.stat.exists): True 9396 1727204036.11614: handler run complete 9396 1727204036.11618: attempt loop complete, returning result 9396 1727204036.11621: _execute() done 9396 1727204036.11623: dumping result to json 9396 1727204036.11630: done dumping result, returning 9396 1727204036.11632: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test1' [12b410aa-8751-36c5-1f9e-000000000017] 9396 1727204036.11635: sending task result for task 12b410aa-8751-36c5-1f9e-000000000017 9396 1727204036.12001: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000017 9396 1727204036.12005: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204036.12065: no more pending results, returning what we have 9396 1727204036.12069: results queue empty 9396 1727204036.12070: checking for any_errors_fatal 9396 1727204036.12078: done checking for any_errors_fatal 9396 1727204036.12079: checking for max_fail_percentage 9396 1727204036.12081: done checking for max_fail_percentage 9396 1727204036.12082: checking to see if all hosts have failed and the running result is not ok 9396 1727204036.12083: done checking to see if all hosts have failed 9396 1727204036.12091: getting the remaining hosts for this loop 9396 1727204036.12092: done getting the remaining hosts for this loop 9396 1727204036.12096: getting the next task for host managed-node1 9396 1727204036.12110: done getting next task for host managed-node1 9396 1727204036.12114: ^ task is: TASK: Include the task 'get_interface_stat.yml' 9396 1727204036.12120: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204036.12124: getting variables 9396 1727204036.12129: in VariableManager get_vars() 9396 1727204036.12178: Calling all_inventory to load vars for managed-node1 9396 1727204036.12182: Calling groups_inventory to load vars for managed-node1 9396 1727204036.12185: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204036.12334: Calling all_plugins_play to load vars for managed-node1 9396 1727204036.12339: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204036.12344: Calling groups_plugins_play to load vars for managed-node1 9396 1727204036.13108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204036.13768: done with get_vars() 9396 1727204036.13783: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.098) 0:00:12.112 ***** 9396 1727204036.14150: entering _queue_task() for managed-node1/include_tasks 9396 1727204036.14813: worker is 1 (out of 1 available) 9396 1727204036.14902: exiting _queue_task() for managed-node1/include_tasks 9396 1727204036.14917: done queuing things up, now waiting for results queue to drain 9396 1727204036.14919: waiting for pending results... 
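The assert task above passes because its conditional, `interface_stat.stat.exists`, evaluates to True against the result registered by the earlier stat task. A rough sketch of that dotted-path lookup over the registered result (a deliberate simplification: real Ansible templates the expression through Jinja2, and the helper name here is invented):

```python
def dotted_lookup(expr: str, data: dict):
    """Resolve a dotted path such as 'interface_stat.stat.exists'
    against nested dicts, the way an assert conditional reads a
    registered result (simplified; Ansible uses Jinja2 for this)."""
    value = data
    for part in expr.split("."):
        value = value[part]
    return value

# Shape mirrors the registered stat result shown in the log above
task_vars = {"interface_stat": {"stat": {"exists": True, "islnk": True}}}
print(dotted_lookup("interface_stat.stat.exists", task_vars))  # True
```

When the lookup yields True for every listed condition, the action returns `changed: false` with "All assertions passed", exactly as the log records.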
9396 1727204036.15509: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 9396 1727204036.15516: in run() - task 12b410aa-8751-36c5-1f9e-00000000001b 9396 1727204036.15708: variable 'ansible_search_path' from source: unknown 9396 1727204036.15712: variable 'ansible_search_path' from source: unknown 9396 1727204036.15895: calling self._execute() 9396 1727204036.15900: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.15903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.15905: variable 'omit' from source: magic vars 9396 1727204036.16678: variable 'ansible_distribution_major_version' from source: facts 9396 1727204036.17079: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204036.17086: _execute() done 9396 1727204036.17091: dumping result to json 9396 1727204036.17094: done dumping result, returning 9396 1727204036.17097: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-36c5-1f9e-00000000001b] 9396 1727204036.17104: sending task result for task 12b410aa-8751-36c5-1f9e-00000000001b 9396 1727204036.17215: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000001b 9396 1727204036.17220: WORKER PROCESS EXITING 9396 1727204036.17260: no more pending results, returning what we have 9396 1727204036.17266: in VariableManager get_vars() 9396 1727204036.17325: Calling all_inventory to load vars for managed-node1 9396 1727204036.17329: Calling groups_inventory to load vars for managed-node1 9396 1727204036.17332: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204036.17350: Calling all_plugins_play to load vars for managed-node1 9396 1727204036.17354: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204036.17359: Calling groups_plugins_play to load vars for managed-node1 9396 1727204036.17985: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204036.18848: done with get_vars() 9396 1727204036.18869: variable 'ansible_search_path' from source: unknown 9396 1727204036.18870: variable 'ansible_search_path' from source: unknown 9396 1727204036.18983: we have included files to process 9396 1727204036.18985: generating all_blocks data 9396 1727204036.18986: done generating all_blocks data 9396 1727204036.19022: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204036.19023: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204036.19027: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204036.19595: done processing included file 9396 1727204036.19598: iterating over new_blocks loaded from include file 9396 1727204036.19600: in VariableManager get_vars() 9396 1727204036.19696: done with get_vars() 9396 1727204036.19699: filtering new block on tags 9396 1727204036.19761: done filtering new block on tags 9396 1727204036.19764: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 9396 1727204036.19769: extending task lists for all hosts with included blocks 9396 1727204036.20105: done extending task lists 9396 1727204036.20107: done processing included files 9396 1727204036.20262: results queue empty 9396 1727204036.20264: checking for any_errors_fatal 9396 1727204036.20269: done checking for any_errors_fatal 9396 1727204036.20270: checking for max_fail_percentage 9396 1727204036.20272: done checking for max_fail_percentage 9396 1727204036.20273: 
checking to see if all hosts have failed and the running result is not ok 9396 1727204036.20274: done checking to see if all hosts have failed 9396 1727204036.20275: getting the remaining hosts for this loop 9396 1727204036.20276: done getting the remaining hosts for this loop 9396 1727204036.20280: getting the next task for host managed-node1 9396 1727204036.20285: done getting next task for host managed-node1 9396 1727204036.20287: ^ task is: TASK: Get stat for interface {{ interface }} 9396 1727204036.20292: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204036.20294: getting variables 9396 1727204036.20296: in VariableManager get_vars() 9396 1727204036.20312: Calling all_inventory to load vars for managed-node1 9396 1727204036.20314: Calling groups_inventory to load vars for managed-node1 9396 1727204036.20317: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204036.20324: Calling all_plugins_play to load vars for managed-node1 9396 1727204036.20327: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204036.20330: Calling groups_plugins_play to load vars for managed-node1 9396 1727204036.20737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204036.21339: done with get_vars() 9396 1727204036.21350: done getting variables 9396 1727204036.21660: variable 'interface' from source: task vars 9396 1727204036.21664: variable 'dhcp_interface2' from source: play vars 9396 1727204036.21746: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.076) 0:00:12.188 ***** 9396 1727204036.21785: entering _queue_task() for managed-node1/stat 9396 1727204036.22251: worker is 1 (out of 1 available) 9396 1727204036.22265: exiting _queue_task() for managed-node1/stat 9396 1727204036.22278: done queuing things up, now waiting for results queue to drain 9396 1727204036.22280: waiting for pending results... 
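The banner changing from `Get stat for interface {{ interface }}` to `TASK [Get stat for interface test2]` reflects two templating steps visible in the variable-source lines: `interface` is itself defined as a reference to the play var `dhcp_interface2`, which holds `test2`. A naive stand-in for that rendering (Ansible really uses Jinja2; this regex substitution only illustrates the resolution order):

```python
import re

def render(template: str, variables: dict) -> str:
    """Naive stand-in for the Jinja2 rendering Ansible applies to task
    names: replace each {{ var }} with its value from the vars dict."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(variables[m.group(1)]),
        template,
    )

# Two-step resolution, matching the "variable ... from source" log lines:
play_vars = {"dhcp_interface2": "test2"}
interface = render("{{ dhcp_interface2 }}", play_vars)
print(render("Get stat for interface {{ interface }}", {"interface": interface}))
# Get stat for interface test2
```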
9396 1727204036.22620: running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 9396 1727204036.22711: in run() - task 12b410aa-8751-36c5-1f9e-00000000016b 9396 1727204036.22733: variable 'ansible_search_path' from source: unknown 9396 1727204036.22742: variable 'ansible_search_path' from source: unknown 9396 1727204036.22808: calling self._execute() 9396 1727204036.22885: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.22901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.22995: variable 'omit' from source: magic vars 9396 1727204036.23364: variable 'ansible_distribution_major_version' from source: facts 9396 1727204036.23383: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204036.23396: variable 'omit' from source: magic vars 9396 1727204036.23482: variable 'omit' from source: magic vars 9396 1727204036.23622: variable 'interface' from source: task vars 9396 1727204036.23633: variable 'dhcp_interface2' from source: play vars 9396 1727204036.23722: variable 'dhcp_interface2' from source: play vars 9396 1727204036.23747: variable 'omit' from source: magic vars 9396 1727204036.23812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204036.23861: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204036.23909: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204036.24130: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204036.24134: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204036.24136: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 
1727204036.24139: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.24141: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.24245: Set connection var ansible_timeout to 10 9396 1727204036.24267: Set connection var ansible_shell_executable to /bin/sh 9396 1727204036.24285: Set connection var ansible_pipelining to False 9396 1727204036.24303: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204036.24316: Set connection var ansible_connection to ssh 9396 1727204036.24324: Set connection var ansible_shell_type to sh 9396 1727204036.24376: variable 'ansible_shell_executable' from source: unknown 9396 1727204036.24387: variable 'ansible_connection' from source: unknown 9396 1727204036.24401: variable 'ansible_module_compression' from source: unknown 9396 1727204036.24411: variable 'ansible_shell_type' from source: unknown 9396 1727204036.24455: variable 'ansible_shell_executable' from source: unknown 9396 1727204036.24458: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.24460: variable 'ansible_pipelining' from source: unknown 9396 1727204036.24463: variable 'ansible_timeout' from source: unknown 9396 1727204036.24476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.24783: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204036.24788: variable 'omit' from source: magic vars 9396 1727204036.24796: starting attempt loop 9396 1727204036.24799: running the handler 9396 1727204036.24801: _low_level_execute_command(): starting 9396 1727204036.24813: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204036.25822: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204036.25910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.26007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204036.26040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204036.26208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204036.27917: stdout chunk (state=3): >>>/root <<< 9396 1727204036.28028: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204036.28255: stderr chunk (state=3): >>><<< 9396 1727204036.28258: stdout chunk (state=3): >>><<< 9396 1727204036.28292: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204036.28597: _low_level_execute_command(): starting 9396 1727204036.28603: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420 `" && echo ansible-tmp-1727204036.2834582-10715-186701132783420="` echo /root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420 `" ) && sleep 0' 9396 1727204036.30130: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204036.30213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.30436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204036.30455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204036.30543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204036.32648: stdout chunk (state=3): >>>ansible-tmp-1727204036.2834582-10715-186701132783420=/root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420 <<< 9396 1727204036.32843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204036.32874: stdout chunk (state=3): >>><<< 9396 1727204036.33221: stderr chunk (state=3): >>><<< 9396 1727204036.33224: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204036.2834582-10715-186701132783420=/root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204036.33227: variable 'ansible_module_compression' from source: unknown 9396 1727204036.33230: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9396 1727204036.33232: variable 'ansible_facts' from source: unknown 9396 1727204036.33410: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/AnsiballZ_stat.py 9396 1727204036.34004: Sending initial data 9396 1727204036.34007: Sent initial data (152 bytes) 9396 1727204036.35114: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204036.35117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204036.35120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204036.35123: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.35202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204036.35253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204036.35335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204036.37121: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204036.37160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204036.37206: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp2_05wn01 /root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/AnsiballZ_stat.py <<< 9396 1727204036.37255: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/AnsiballZ_stat.py" <<< 9396 1727204036.37471: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp2_05wn01" to remote "/root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/AnsiballZ_stat.py" <<< 9396 1727204036.39215: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204036.39236: stdout chunk (state=3): >>><<< 9396 1727204036.39340: stderr chunk (state=3): >>><<< 9396 1727204036.39344: done transferring module to remote 9396 1727204036.39347: _low_level_execute_command(): starting 9396 1727204036.39350: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/ /root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/AnsiballZ_stat.py && sleep 0' 9396 1727204036.40243: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204036.40247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.40324: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204036.40328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.40380: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204036.40449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204036.42475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204036.42479: stdout chunk (state=3): >>><<< 9396 1727204036.42531: stderr chunk (state=3): >>><<< 9396 1727204036.42616: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204036.42623: _low_level_execute_command(): starting 9396 1727204036.42626: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/AnsiballZ_stat.py && sleep 0' 9396 1727204036.43591: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204036.43595: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204036.43598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204036.43615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204036.43630: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204036.43678: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204036.43681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.43684: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204036.43693: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204036.43696: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204036.43701: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204036.43703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204036.43712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204036.43787: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204036.43793: stderr chunk (state=3): >>>debug2: match found <<< 9396 1727204036.43796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.43814: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204036.43827: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204036.43838: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204036.43932: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204036.61861: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34429, "dev": 23, "nlink": 1, "atime": 1727204034.0863311, "mtime": 1727204034.0863311, "ctime": 1727204034.0863311, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 9396 1727204036.63576: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204036.63580: stdout chunk (state=3): >>><<< 9396 1727204036.63583: stderr chunk (state=3): >>><<< 9396 1727204036.63586: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34429, "dev": 23, "nlink": 1, "atime": 1727204034.0863311, "mtime": 1727204034.0863311, "ctime": 1727204034.0863311, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204036.63620: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204036.63637: _low_level_execute_command(): starting 9396 1727204036.63648: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204036.2834582-10715-186701132783420/ > /dev/null 2>&1 && sleep 0' 9396 1727204036.64363: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204036.64404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.64419: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 9396 1727204036.64455: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204036.64566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204036.64585: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204036.64668: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204036.66640: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204036.66748: stderr chunk (state=3): >>><<< 9396 1727204036.66752: stdout chunk (state=3): >>><<< 9396 1727204036.66771: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204036.66779: handler run complete 9396 1727204036.67098: attempt loop complete, returning result 9396 1727204036.67102: _execute() done 9396 1727204036.67104: dumping result to json 9396 1727204036.67106: done dumping result, returning 9396 1727204036.67108: done running TaskExecutor() for managed-node1/TASK: Get stat for interface test2 [12b410aa-8751-36c5-1f9e-00000000016b] 9396 1727204036.67110: sending task result for task 12b410aa-8751-36c5-1f9e-00000000016b 9396 1727204036.67183: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000016b 9396 1727204036.67186: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204034.0863311, "block_size": 4096, "blocks": 0, "ctime": 1727204034.0863311, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34429, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1727204034.0863311, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 9396 1727204036.67348: no more pending results, returning what we have 9396 1727204036.67353: results queue empty 9396 1727204036.67354: checking for any_errors_fatal 9396 1727204036.67356: done checking for any_errors_fatal 9396 1727204036.67357: checking for max_fail_percentage 9396 
1727204036.67359: done checking for max_fail_percentage 9396 1727204036.67360: checking to see if all hosts have failed and the running result is not ok 9396 1727204036.67361: done checking to see if all hosts have failed 9396 1727204036.67362: getting the remaining hosts for this loop 9396 1727204036.67363: done getting the remaining hosts for this loop 9396 1727204036.67368: getting the next task for host managed-node1 9396 1727204036.67376: done getting next task for host managed-node1 9396 1727204036.67379: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 9396 1727204036.67382: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204036.67387: getting variables 9396 1727204036.67499: in VariableManager get_vars() 9396 1727204036.67546: Calling all_inventory to load vars for managed-node1 9396 1727204036.67549: Calling groups_inventory to load vars for managed-node1 9396 1727204036.67553: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204036.67564: Calling all_plugins_play to load vars for managed-node1 9396 1727204036.67567: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204036.67572: Calling groups_plugins_play to load vars for managed-node1 9396 1727204036.67897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204036.68243: done with get_vars() 9396 1727204036.68266: done getting variables 9396 1727204036.68351: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204036.68533: variable 'interface' from source: task vars 9396 1727204036.68538: variable 'dhcp_interface2' from source: play vars 9396 1727204036.68635: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.468) 0:00:12.657 ***** 9396 1727204036.68675: entering _queue_task() for managed-node1/assert 9396 1727204036.69081: worker is 1 (out of 1 available) 9396 1727204036.69098: exiting _queue_task() for managed-node1/assert 9396 1727204036.69117: done queuing things up, now waiting for results queue to drain 9396 1727204036.69119: waiting for pending results... 
9396 1727204036.69415: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' 9396 1727204036.69543: in run() - task 12b410aa-8751-36c5-1f9e-00000000001c 9396 1727204036.69560: variable 'ansible_search_path' from source: unknown 9396 1727204036.69564: variable 'ansible_search_path' from source: unknown 9396 1727204036.69620: calling self._execute() 9396 1727204036.69725: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.69732: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.69744: variable 'omit' from source: magic vars 9396 1727204036.70190: variable 'ansible_distribution_major_version' from source: facts 9396 1727204036.70205: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204036.70215: variable 'omit' from source: magic vars 9396 1727204036.70286: variable 'omit' from source: magic vars 9396 1727204036.70421: variable 'interface' from source: task vars 9396 1727204036.70425: variable 'dhcp_interface2' from source: play vars 9396 1727204036.70520: variable 'dhcp_interface2' from source: play vars 9396 1727204036.70545: variable 'omit' from source: magic vars 9396 1727204036.70794: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204036.70798: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204036.70800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204036.70802: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204036.70805: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204036.70807: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 9396 1727204036.70809: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.70811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.70936: Set connection var ansible_timeout to 10 9396 1727204036.70944: Set connection var ansible_shell_executable to /bin/sh 9396 1727204036.70956: Set connection var ansible_pipelining to False 9396 1727204036.70964: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204036.70972: Set connection var ansible_connection to ssh 9396 1727204036.70975: Set connection var ansible_shell_type to sh 9396 1727204036.71015: variable 'ansible_shell_executable' from source: unknown 9396 1727204036.71024: variable 'ansible_connection' from source: unknown 9396 1727204036.71036: variable 'ansible_module_compression' from source: unknown 9396 1727204036.71039: variable 'ansible_shell_type' from source: unknown 9396 1727204036.71042: variable 'ansible_shell_executable' from source: unknown 9396 1727204036.71047: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204036.71052: variable 'ansible_pipelining' from source: unknown 9396 1727204036.71055: variable 'ansible_timeout' from source: unknown 9396 1727204036.71062: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204036.71267: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204036.71281: variable 'omit' from source: magic vars 9396 1727204036.71291: starting attempt loop 9396 1727204036.71295: running the handler 9396 1727204036.71492: variable 'interface_stat' from source: set_fact 9396 1727204036.71519: Evaluated conditional (interface_stat.stat.exists): True 9396 
1727204036.71528: handler run complete 9396 1727204036.71547: attempt loop complete, returning result 9396 1727204036.71557: _execute() done 9396 1727204036.71565: dumping result to json 9396 1727204036.71568: done dumping result, returning 9396 1727204036.71585: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'test2' [12b410aa-8751-36c5-1f9e-00000000001c] 9396 1727204036.71593: sending task result for task 12b410aa-8751-36c5-1f9e-00000000001c ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204036.71743: no more pending results, returning what we have 9396 1727204036.71747: results queue empty 9396 1727204036.71748: checking for any_errors_fatal 9396 1727204036.71758: done checking for any_errors_fatal 9396 1727204036.71759: checking for max_fail_percentage 9396 1727204036.71761: done checking for max_fail_percentage 9396 1727204036.71762: checking to see if all hosts have failed and the running result is not ok 9396 1727204036.71763: done checking to see if all hosts have failed 9396 1727204036.71764: getting the remaining hosts for this loop 9396 1727204036.71765: done getting the remaining hosts for this loop 9396 1727204036.71770: getting the next task for host managed-node1 9396 1727204036.71779: done getting next task for host managed-node1 9396 1727204036.71782: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 9396 1727204036.71784: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204036.71788: getting variables 9396 1727204036.71790: in VariableManager get_vars() 9396 1727204036.71837: Calling all_inventory to load vars for managed-node1 9396 1727204036.71840: Calling groups_inventory to load vars for managed-node1 9396 1727204036.71843: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204036.71857: Calling all_plugins_play to load vars for managed-node1 9396 1727204036.71861: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204036.71866: Calling groups_plugins_play to load vars for managed-node1 9396 1727204036.72288: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000001c 9396 1727204036.72295: WORKER PROCESS EXITING 9396 1727204036.72321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204036.72892: done with get_vars() 9396 1727204036.72903: done getting variables 9396 1727204036.72984: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:28 Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.043) 0:00:12.701 ***** 9396 1727204036.73019: entering _queue_task() for managed-node1/command 9396 1727204036.73413: worker is 1 (out of 1 available) 9396 1727204036.73424: exiting _queue_task() for managed-node1/command 9396 1727204036.73437: done queuing things up, now waiting for results queue to drain 9396 1727204036.73439: waiting for pending results... 
9396 1727204036.73707: running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript
9396 1727204036.73739: in run() - task 12b410aa-8751-36c5-1f9e-00000000001d
9396 1727204036.73755: variable 'ansible_search_path' from source: unknown
9396 1727204036.73806: calling self._execute()
9396 1727204036.73916: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204036.73924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204036.73942: variable 'omit' from source: magic vars
9396 1727204036.74427: variable 'ansible_distribution_major_version' from source: facts
9396 1727204036.74481: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204036.74629: variable 'network_provider' from source: set_fact
9396 1727204036.74636: Evaluated conditional (network_provider == "initscripts"): False
9396 1727204036.74639: when evaluation is False, skipping this task
9396 1727204036.74666: _execute() done
9396 1727204036.74669: dumping result to json
9396 1727204036.74672: done dumping result, returning
9396 1727204036.74675: done running TaskExecutor() for managed-node1/TASK: Backup the /etc/resolv.conf for initscript [12b410aa-8751-36c5-1f9e-00000000001d]
9396 1727204036.74678: sending task result for task 12b410aa-8751-36c5-1f9e-00000000001d
9396 1727204036.74830: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000001d
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
9396 1727204036.74892: no more pending results, returning what we have
9396 1727204036.74896: results queue empty
9396 1727204036.74897: checking for any_errors_fatal
9396 1727204036.74903: done checking for any_errors_fatal
9396 1727204036.74904: checking for max_fail_percentage
9396 1727204036.74906: done checking for max_fail_percentage
9396 1727204036.74907: checking to see if all hosts have failed and the running result is not ok
9396 1727204036.74914: done checking to see if all hosts have failed
9396 1727204036.74915: getting the remaining hosts for this loop
9396 1727204036.74916: done getting the remaining hosts for this loop
9396 1727204036.74920: getting the next task for host managed-node1
9396 1727204036.74924: done getting next task for host managed-node1
9396 1727204036.74927: ^ task is: TASK: TEST Add Bond with 2 ports using deprecated 'master' argument
9396 1727204036.74929: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204036.74932: getting variables
9396 1727204036.74934: in VariableManager get_vars()
9396 1727204036.74969: Calling all_inventory to load vars for managed-node1
9396 1727204036.74972: Calling groups_inventory to load vars for managed-node1
9396 1727204036.74974: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204036.74994: Calling all_plugins_play to load vars for managed-node1
9396 1727204036.74999: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204036.75003: Calling groups_plugins_play to load vars for managed-node1
9396 1727204036.75141: WORKER PROCESS EXITING
9396 1727204036.75154: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204036.75319: done with get_vars()
9396 1727204036.75328: done getting variables
9396 1727204036.75373: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [TEST Add Bond with 2 ports using deprecated 'master' argument] ***********
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:33
Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.023) 0:00:12.724 *****
9396 1727204036.75397: entering _queue_task() for managed-node1/debug
9396 1727204036.75603: worker is 1 (out of 1 available)
9396 1727204036.75621: exiting _queue_task() for managed-node1/debug
9396 1727204036.75633: done queuing things up, now waiting for results queue to drain
9396 1727204036.75635: waiting for pending results...
9396 1727204036.75793: running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument
9396 1727204036.75859: in run() - task 12b410aa-8751-36c5-1f9e-00000000001e
9396 1727204036.75876: variable 'ansible_search_path' from source: unknown
9396 1727204036.75905: calling self._execute()
9396 1727204036.75979: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204036.75985: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204036.75996: variable 'omit' from source: magic vars
9396 1727204036.76275: variable 'ansible_distribution_major_version' from source: facts
9396 1727204036.76286: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204036.76294: variable 'omit' from source: magic vars
9396 1727204036.76314: variable 'omit' from source: magic vars
9396 1727204036.76347: variable 'omit' from source: magic vars
9396 1727204036.76378: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
9396 1727204036.76413: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
9396 1727204036.76433: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
9396 1727204036.76450: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
9396 1727204036.76461: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
9396 1727204036.76487: variable 'inventory_hostname' from source: host vars for 'managed-node1'
9396 1727204036.76493: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204036.76496: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204036.76581: Set connection var ansible_timeout to 10
9396 1727204036.76587: Set connection var ansible_shell_executable to /bin/sh
9396 1727204036.76598: Set connection var ansible_pipelining to False
9396 1727204036.76604: Set connection var ansible_module_compression to ZIP_DEFLATED
9396 1727204036.76614: Set connection var ansible_connection to ssh
9396 1727204036.76617: Set connection var ansible_shell_type to sh
9396 1727204036.76652: variable 'ansible_shell_executable' from source: unknown
9396 1727204036.76655: variable 'ansible_connection' from source: unknown
9396 1727204036.76662: variable 'ansible_module_compression' from source: unknown
9396 1727204036.76665: variable 'ansible_shell_type' from source: unknown
9396 1727204036.76667: variable 'ansible_shell_executable' from source: unknown
9396 1727204036.76672: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204036.76677: variable 'ansible_pipelining' from source: unknown
9396 1727204036.76680: variable 'ansible_timeout' from source: unknown
9396 1727204036.76685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204036.76942: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
9396 1727204036.76947: variable 'omit' from source: magic vars
9396 1727204036.76951: starting attempt loop
9396 1727204036.76953: running the handler
9396 1727204036.76956: handler run complete
9396 1727204036.76958: attempt loop complete, returning result
9396 1727204036.76961: _execute() done
9396 1727204036.76963: dumping result to json
9396 1727204036.76965: done dumping result, returning
9396 1727204036.76976: done running TaskExecutor() for managed-node1/TASK: TEST Add Bond with 2 ports using deprecated 'master' argument [12b410aa-8751-36c5-1f9e-00000000001e]
9396 1727204036.76985: sending task result for task 12b410aa-8751-36c5-1f9e-00000000001e
9396 1727204036.77254: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000001e
9396 1727204036.77257: WORKER PROCESS EXITING
ok: [managed-node1] => {}

MSG:

##################################################
9396 1727204036.77327: no more pending results, returning what we have
9396 1727204036.77331: results queue empty
9396 1727204036.77332: checking for any_errors_fatal
9396 1727204036.77336: done checking for any_errors_fatal
9396 1727204036.77337: checking for max_fail_percentage
9396 1727204036.77338: done checking for max_fail_percentage
9396 1727204036.77339: checking to see if all hosts have failed and the running result is not ok
9396 1727204036.77340: done checking to see if all hosts have failed
9396 1727204036.77341: getting the remaining hosts for this loop
9396 1727204036.77343: done getting the remaining hosts for this loop
9396 1727204036.77346: getting the next task for host managed-node1
9396 1727204036.77352: done getting next task for host managed-node1
9396 1727204036.77357: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
9396 1727204036.77361: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204036.77383: getting variables
9396 1727204036.77385: in VariableManager get_vars()
9396 1727204036.77428: Calling all_inventory to load vars for managed-node1
9396 1727204036.77431: Calling groups_inventory to load vars for managed-node1
9396 1727204036.77434: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204036.77443: Calling all_plugins_play to load vars for managed-node1
9396 1727204036.77447: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204036.77451: Calling groups_plugins_play to load vars for managed-node1
9396 1727204036.77729: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204036.77918: done with get_vars()
9396 1727204036.77926: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.026) 0:00:12.750 *****
9396 1727204036.78001: entering _queue_task() for managed-node1/include_tasks
9396 1727204036.78185: worker is 1 (out of 1 available)
9396 1727204036.78200: exiting _queue_task() for managed-node1/include_tasks
9396 1727204036.78215: done queuing things up, now waiting for results queue to drain
9396 1727204036.78217: waiting for pending results...
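The skip recorded above for the resolv.conf backup task comes from two stacked `when` conditionals. A minimal plain-Python sketch of that evaluation (the concrete input values are illustrative assumptions; the log records only the results, `!= '6'` being True and `== "initscripts"` being False):

```python
# Sketch of the two `when` conditions evaluated for the backup task.
# Values are assumptions: the log shows only the evaluation results.
ansible_distribution_major_version = "40"  # assumed; any value != "6"
network_provider = "nm"                    # assumed; not "initscripts"

conditions = [
    ansible_distribution_major_version != "6",  # True in the log
    network_provider == "initscripts",          # False in the log
]

# Ansible ANDs a list of `when` conditions; a single False skips the task.
print(all(conditions))  # False -> "skip_reason": "Conditional result was False"
```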
9396 1727204036.78384: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
9396 1727204036.78479: in run() - task 12b410aa-8751-36c5-1f9e-000000000026
9396 1727204036.78493: variable 'ansible_search_path' from source: unknown
9396 1727204036.78497: variable 'ansible_search_path' from source: unknown
9396 1727204036.78529: calling self._execute()
9396 1727204036.78600: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204036.78609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204036.78617: variable 'omit' from source: magic vars
9396 1727204036.78913: variable 'ansible_distribution_major_version' from source: facts
9396 1727204036.78923: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204036.78929: _execute() done
9396 1727204036.78935: dumping result to json
9396 1727204036.78938: done dumping result, returning
9396 1727204036.78946: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-36c5-1f9e-000000000026]
9396 1727204036.78959: sending task result for task 12b410aa-8751-36c5-1f9e-000000000026
9396 1727204036.79054: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000026
9396 1727204036.79057: WORKER PROCESS EXITING
9396 1727204036.79101: no more pending results, returning what we have
9396 1727204036.79106: in VariableManager get_vars()
9396 1727204036.79149: Calling all_inventory to load vars for managed-node1
9396 1727204036.79153: Calling groups_inventory to load vars for managed-node1
9396 1727204036.79156: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204036.79165: Calling all_plugins_play to load vars for managed-node1
9396 1727204036.79175: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204036.79178: Calling groups_plugins_play to load vars for managed-node1
9396 1727204036.79318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204036.79477: done with get_vars()
9396 1727204036.79483: variable 'ansible_search_path' from source: unknown
9396 1727204036.79484: variable 'ansible_search_path' from source: unknown
9396 1727204036.79520: we have included files to process
9396 1727204036.79521: generating all_blocks data
9396 1727204036.79523: done generating all_blocks data
9396 1727204036.79526: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
9396 1727204036.79527: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
9396 1727204036.79529: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
9396 1727204036.80124: done processing included file
9396 1727204036.80125: iterating over new_blocks loaded from include file
9396 1727204036.80127: in VariableManager get_vars()
9396 1727204036.80145: done with get_vars()
9396 1727204036.80148: filtering new block on tags
9396 1727204036.80164: done filtering new block on tags
9396 1727204036.80166: in VariableManager get_vars()
9396 1727204036.80184: done with get_vars()
9396 1727204036.80186: filtering new block on tags
9396 1727204036.80203: done filtering new block on tags
9396 1727204036.80205: in VariableManager get_vars()
9396 1727204036.80223: done with get_vars()
9396 1727204036.80224: filtering new block on tags
9396 1727204036.80238: done filtering new block on tags
9396 1727204036.80240: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1
9396 1727204036.80243: extending task lists for all hosts with included blocks
9396 1727204036.81157: done extending task lists
9396 1727204036.81158: done processing included files
9396 1727204036.81159: results queue empty
9396 1727204036.81160: checking for any_errors_fatal
9396 1727204036.81164: done checking for any_errors_fatal
9396 1727204036.81165: checking for max_fail_percentage
9396 1727204036.81166: done checking for max_fail_percentage
9396 1727204036.81167: checking to see if all hosts have failed and the running result is not ok
9396 1727204036.81168: done checking to see if all hosts have failed
9396 1727204036.81169: getting the remaining hosts for this loop
9396 1727204036.81170: done getting the remaining hosts for this loop
9396 1727204036.81173: getting the next task for host managed-node1
9396 1727204036.81177: done getting next task for host managed-node1
9396 1727204036.81180: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
9396 1727204036.81184: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204036.81196: getting variables
9396 1727204036.81197: in VariableManager get_vars()
9396 1727204036.81216: Calling all_inventory to load vars for managed-node1
9396 1727204036.81219: Calling groups_inventory to load vars for managed-node1
9396 1727204036.81221: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204036.81227: Calling all_plugins_play to load vars for managed-node1
9396 1727204036.81231: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204036.81234: Calling groups_plugins_play to load vars for managed-node1
9396 1727204036.81443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204036.81720: done with get_vars()
9396 1727204036.81731: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.038) 0:00:12.789 *****
9396 1727204036.81817: entering _queue_task() for managed-node1/setup
9396 1727204036.82095: worker is 1 (out of 1 available)
9396 1727204036.82108: exiting _queue_task() for managed-node1/setup
9396 1727204036.82122: done queuing things up, now waiting for results queue to drain
9396 1727204036.82124: waiting for pending results...
9396 1727204036.82442: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
9396 1727204036.82565: in run() - task 12b410aa-8751-36c5-1f9e-000000000189
9396 1727204036.82579: variable 'ansible_search_path' from source: unknown
9396 1727204036.82583: variable 'ansible_search_path' from source: unknown
9396 1727204036.82631: calling self._execute()
9396 1727204036.82713: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204036.82736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204036.82742: variable 'omit' from source: magic vars
9396 1727204036.83184: variable 'ansible_distribution_major_version' from source: facts
9396 1727204036.83199: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204036.83498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
9396 1727204036.86138: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
9396 1727204036.86216: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
9396 1727204036.86495: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
9396 1727204036.86499: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
9396 1727204036.86502: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
9396 1727204036.86505: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204036.86510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204036.86512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204036.86799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204036.86803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204036.86806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204036.86814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204036.86817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204036.86820: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204036.86822: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204036.86950: variable '__network_required_facts' from source: role '' defaults
9396 1727204036.86960: variable 'ansible_facts' from source: unknown
9396 1727204036.87079: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
9396 1727204036.87083: when evaluation is False, skipping this task
9396 1727204036.87088: _execute() done
9396 1727204036.87095: dumping result to json
9396 1727204036.87099: done dumping result, returning
9396 1727204036.87112: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-36c5-1f9e-000000000189]
9396 1727204036.87119: sending task result for task 12b410aa-8751-36c5-1f9e-000000000189
9396 1727204036.87366: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000189
9396 1727204036.87369: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
9396 1727204036.87420: no more pending results, returning what we have
9396 1727204036.87425: results queue empty
9396 1727204036.87426: checking for any_errors_fatal
9396 1727204036.87428: done checking for any_errors_fatal
9396 1727204036.87429: checking for max_fail_percentage
9396 1727204036.87431: done checking for max_fail_percentage
9396 1727204036.87432: checking to see if all hosts have failed and the running result is not ok
9396 1727204036.87433: done checking to see if all hosts have failed
9396 1727204036.87434: getting the remaining hosts for this loop
9396 1727204036.87435: done getting the remaining hosts for this loop
9396 1727204036.87440: getting the next task for host managed-node1
9396 1727204036.87451: done getting next task for host managed-node1
9396 1727204036.87455: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
9396 1727204036.87460: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204036.87476: getting variables
9396 1727204036.87478: in VariableManager get_vars()
9396 1727204036.87529: Calling all_inventory to load vars for managed-node1
9396 1727204036.87532: Calling groups_inventory to load vars for managed-node1
9396 1727204036.87536: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204036.87549: Calling all_plugins_play to load vars for managed-node1
9396 1727204036.87553: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204036.87557: Calling groups_plugins_play to load vars for managed-node1
9396 1727204036.87949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204036.88234: done with get_vars()
9396 1727204036.88247: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.065) 0:00:12.854 *****
9396 1727204036.88367: entering _queue_task() for managed-node1/stat
9396 1727204036.89080: worker is 1 (out of 1 available)
9396 1727204036.89302: exiting _queue_task() for managed-node1/stat
9396 1727204036.89315: done queuing things up, now waiting for results queue to drain
9396 1727204036.89318: waiting for pending results...
9396 1727204036.89672: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
9396 1727204036.89875: in run() - task 12b410aa-8751-36c5-1f9e-00000000018b
9396 1727204036.89882: variable 'ansible_search_path' from source: unknown
9396 1727204036.89886: variable 'ansible_search_path' from source: unknown
9396 1727204036.89917: calling self._execute()
9396 1727204036.90021: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204036.90088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204036.90097: variable 'omit' from source: magic vars
9396 1727204036.90525: variable 'ansible_distribution_major_version' from source: facts
9396 1727204036.90551: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204036.90791: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9396 1727204036.91223: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9396 1727204036.91278: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9396 1727204036.91397: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9396 1727204036.91443: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9396 1727204036.91681: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
9396 1727204036.91770: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
9396 1727204036.91878: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204036.92066: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
9396 1727204036.92155: variable '__network_is_ostree' from source: set_fact
9396 1727204036.92220: Evaluated conditional (not __network_is_ostree is defined): False
9396 1727204036.92317: when evaluation is False, skipping this task
9396 1727204036.92321: _execute() done
9396 1727204036.92324: dumping result to json
9396 1727204036.92328: done dumping result, returning
9396 1727204036.92339: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-36c5-1f9e-00000000018b]
9396 1727204036.92353: sending task result for task 12b410aa-8751-36c5-1f9e-00000000018b
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
9396 1727204036.92527: no more pending results, returning what we have
9396 1727204036.92533: results queue empty
9396 1727204036.92534: checking for any_errors_fatal
9396 1727204036.92541: done checking for any_errors_fatal
9396 1727204036.92542: checking for max_fail_percentage
9396 1727204036.92544: done checking for max_fail_percentage
9396 1727204036.92545: checking to see if all hosts have failed and the running result is not ok
9396 1727204036.92546: done checking to see if all hosts have failed
9396 1727204036.92546: getting the remaining hosts for this loop
9396 1727204036.92548: done getting the remaining hosts for this loop
9396 1727204036.92552: getting the next task for host managed-node1
9396 1727204036.92559: done getting next task for host managed-node1
9396 1727204036.92563: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
9396 1727204036.92567: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204036.92582: getting variables
9396 1727204036.92584: in VariableManager get_vars()
9396 1727204036.92631: Calling all_inventory to load vars for managed-node1
9396 1727204036.92635: Calling groups_inventory to load vars for managed-node1
9396 1727204036.92638: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204036.92650: Calling all_plugins_play to load vars for managed-node1
9396 1727204036.92654: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204036.92658: Calling groups_plugins_play to load vars for managed-node1
9396 1727204036.93313: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000018b
9396 1727204036.93317: WORKER PROCESS EXITING
9396 1727204036.93507: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204036.94006: done with get_vars()
9396 1727204036.94018: done getting variables
9396 1727204036.94085: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Tuesday 24 September 2024 14:53:56 -0400 (0:00:00.059) 0:00:12.914 *****
9396 1727204036.94328: entering _queue_task() for managed-node1/set_fact
9396 1727204036.94818: worker is 1 (out of 1 available)
9396 1727204036.94831: exiting _queue_task() for managed-node1/set_fact
9396 1727204036.94844: done queuing things up, now waiting for results queue to drain
9396 1727204036.94846: waiting for pending results...
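The "Ensure ansible_facts used by role are present" task above was skipped because `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0` evaluated to False. The `difference` filter keeps items of the left list that are missing from the right one, so the setup task only runs when some required fact has not been gathered yet. A sketch of that logic with hypothetical fact names (the role's actual `__network_required_facts` list is not shown in the log):

```python
# Sketch of the fact-gating condition from set_facts.yml.
# Fact names are hypothetical; only the filter logic is modeled.
required_facts = ["distribution", "distribution_major_version"]
ansible_facts = {
    "distribution": "Fedora",
    "distribution_major_version": "40",
    "hostname": "managed-node1",
}

# Equivalent of the `difference` filter: required facts not yet gathered.
missing = [f for f in required_facts if f not in ansible_facts.keys()]

print(len(missing) > 0)  # False -> every required fact present, task skipped
```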
9396 1727204036.95375: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
9396 1727204036.95813: in run() - task 12b410aa-8751-36c5-1f9e-00000000018c
9396 1727204036.95837: variable 'ansible_search_path' from source: unknown
9396 1727204036.95956: variable 'ansible_search_path' from source: unknown
9396 1727204036.95959: calling self._execute()
9396 1727204036.96146: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204036.96181: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204036.96201: variable 'omit' from source: magic vars
9396 1727204036.97296: variable 'ansible_distribution_major_version' from source: facts
9396 1727204036.97301: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204036.97895: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9396 1727204036.98327: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9396 1727204036.98504: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9396 1727204036.98554: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9396 1727204036.98691: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9396 1727204036.98913: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
9396 1727204036.98949: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
9396 1727204036.98985: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204036.99219: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
9396 1727204036.99363: variable '__network_is_ostree' from source: set_fact
9396 1727204036.99377: Evaluated conditional (not __network_is_ostree is defined): False
9396 1727204036.99387: when evaluation is False, skipping this task
9396 1727204036.99399: _execute() done
9396 1727204036.99411: dumping result to json
9396 1727204036.99443: done dumping result, returning
9396 1727204036.99458: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-36c5-1f9e-00000000018c]
9396 1727204036.99595: sending task result for task 12b410aa-8751-36c5-1f9e-00000000018c
9396 1727204036.99793: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000018c
9396 1727204036.99798: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
9396 1727204036.99854: no more pending results, returning what we have
9396 1727204036.99859: results queue empty
9396 1727204036.99860: checking for any_errors_fatal
9396 1727204036.99867: done checking for any_errors_fatal
9396 1727204036.99869: checking for max_fail_percentage
9396 1727204036.99871: done checking for max_fail_percentage
9396 1727204036.99872: checking to see if all hosts have failed and the running result is not ok
9396 1727204036.99874: done checking to see if all hosts have failed
9396 1727204036.99875: getting the remaining hosts for this loop
9396 1727204036.99876: done getting the remaining hosts for this loop
9396 1727204036.99882:
getting the next task for host managed-node1 9396 1727204036.99894: done getting next task for host managed-node1 9396 1727204036.99898: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 9396 1727204036.99902: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204036.99917: getting variables 9396 1727204036.99919: in VariableManager get_vars() 9396 1727204036.99965: Calling all_inventory to load vars for managed-node1 9396 1727204036.99968: Calling groups_inventory to load vars for managed-node1 9396 1727204036.99971: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204036.99981: Calling all_plugins_play to load vars for managed-node1 9396 1727204036.99985: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204037.00292: Calling groups_plugins_play to load vars for managed-node1 9396 1727204037.00720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204037.01213: done with get_vars() 9396 1727204037.01227: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:53:57 -0400 (0:00:00.072) 0:00:12.986 ***** 9396 1727204037.01549: entering _queue_task() for managed-node1/service_facts 9396 1727204037.01552: Creating lock for service_facts 9396 1727204037.02053: worker is 1 (out of 1 available) 9396 1727204037.02066: exiting _queue_task() for managed-node1/service_facts 9396 1727204037.02082: done queuing things up, now waiting for results queue to drain 9396 1727204037.02084: waiting for pending results... 
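From here the log walks through Ansible's standard remote execution sequence for the `service_facts` module: create a uniquely named temp directory over SSH (`umask 77 && mkdir ... ansible-tmp-<epoch>-<pid>-<random>`), sftp the AnsiballZ payload into it, `chmod u+x` it, then invoke it with the remote Python and collect stdout. A rough local illustration of that sequence (the payload and paths are made up for the example; this is not Ansible's actual implementation):

```python
import os
import random
import stat
import subprocess
import sys
import tempfile
import time

# Directory name mimics the pattern seen in the log:
# ansible-tmp-<epoch with fraction>-<pid>-<random int>
suffix = f"{time.time()}-{os.getpid()}-{random.randint(0, 2**48)}"
tmpdir = os.path.join(tempfile.gettempdir(), f"ansible-tmp-{suffix}")
os.makedirs(tmpdir, mode=0o700)  # log equivalent: ( umask 77 && mkdir ... )

# Stand-in for the transferred AnsiballZ_service_facts.py payload.
module = os.path.join(tmpdir, "AnsiballZ_service_facts.py")
with open(module, "w") as f:
    f.write('print("AnsiballZ module ran")\n')

os.chmod(module, os.stat(module).st_mode | stat.S_IXUSR)  # chmod u+x

# Log equivalent: /bin/sh -c '/usr/bin/python3.12 .../AnsiballZ_... && sleep 0'
out = subprocess.run([sys.executable, module],
                     capture_output=True, text=True, check=True)
print(out.stdout.strip())
```

In the real run the module's stdout is the single JSON document under `ansible_facts.services` that starts streaming back in the chunks below.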
9396 1727204037.02569: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 9396 1727204037.03001: in run() - task 12b410aa-8751-36c5-1f9e-00000000018e 9396 1727204037.03006: variable 'ansible_search_path' from source: unknown 9396 1727204037.03009: variable 'ansible_search_path' from source: unknown 9396 1727204037.03153: calling self._execute() 9396 1727204037.03225: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204037.03497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204037.03502: variable 'omit' from source: magic vars 9396 1727204037.04349: variable 'ansible_distribution_major_version' from source: facts 9396 1727204037.04367: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204037.04421: variable 'omit' from source: magic vars 9396 1727204037.04651: variable 'omit' from source: magic vars 9396 1727204037.04745: variable 'omit' from source: magic vars 9396 1727204037.04795: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204037.04888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204037.04977: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204037.05072: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204037.05095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204037.05140: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204037.05178: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204037.05190: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 9396 1727204037.05464: Set connection var ansible_timeout to 10 9396 1727204037.05509: Set connection var ansible_shell_executable to /bin/sh 9396 1727204037.05616: Set connection var ansible_pipelining to False 9396 1727204037.05620: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204037.05622: Set connection var ansible_connection to ssh 9396 1727204037.05624: Set connection var ansible_shell_type to sh 9396 1727204037.05626: variable 'ansible_shell_executable' from source: unknown 9396 1727204037.05633: variable 'ansible_connection' from source: unknown 9396 1727204037.05635: variable 'ansible_module_compression' from source: unknown 9396 1727204037.05638: variable 'ansible_shell_type' from source: unknown 9396 1727204037.05640: variable 'ansible_shell_executable' from source: unknown 9396 1727204037.05643: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204037.05645: variable 'ansible_pipelining' from source: unknown 9396 1727204037.05647: variable 'ansible_timeout' from source: unknown 9396 1727204037.05649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204037.05896: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204037.05919: variable 'omit' from source: magic vars 9396 1727204037.05931: starting attempt loop 9396 1727204037.05945: running the handler 9396 1727204037.05968: _low_level_execute_command(): starting 9396 1727204037.05983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204037.06843: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204037.06848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204037.06907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204037.06951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204037.07030: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204037.08809: stdout chunk (state=3): >>>/root <<< 9396 1727204037.08985: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204037.09006: stderr chunk (state=3): >>><<< 9396 1727204037.09029: stdout chunk (state=3): >>><<< 9396 1727204037.09096: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204037.09099: _low_level_execute_command(): starting 9396 1727204037.09103: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586 `" && echo ansible-tmp-1727204037.0905485-10748-204915896132586="` echo /root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586 `" ) && sleep 0' 9396 1727204037.09733: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204037.09791: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204037.09903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204037.09941: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204037.09998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204037.12076: stdout chunk (state=3): >>>ansible-tmp-1727204037.0905485-10748-204915896132586=/root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586 <<< 9396 1727204037.12204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204037.12243: stderr chunk (state=3): >>><<< 9396 1727204037.12247: stdout chunk (state=3): >>><<< 9396 1727204037.12262: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204037.0905485-10748-204915896132586=/root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204037.12307: variable 'ansible_module_compression' from source: unknown 9396 1727204037.12349: ANSIBALLZ: Using lock for service_facts 9396 1727204037.12353: ANSIBALLZ: Acquiring lock 9396 1727204037.12355: ANSIBALLZ: Lock acquired: 139797140921552 9396 1727204037.12358: ANSIBALLZ: Creating module 9396 1727204037.28409: ANSIBALLZ: Writing module into payload 9396 1727204037.28494: ANSIBALLZ: Writing module 9396 1727204037.28516: ANSIBALLZ: Renaming module 9396 1727204037.28523: ANSIBALLZ: Done creating module 9396 1727204037.28539: variable 'ansible_facts' from source: unknown 9396 1727204037.28589: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/AnsiballZ_service_facts.py 9396 1727204037.28719: Sending initial data 9396 1727204037.28722: Sent initial data (161 bytes) 9396 1727204037.29185: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204037.29221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204037.29224: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204037.29227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 9396 1727204037.29229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204037.29287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204037.29298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204037.29356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204037.31075: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204037.31111: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204037.31159: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmprmo0l8mr /root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/AnsiballZ_service_facts.py <<< 9396 1727204037.31162: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/AnsiballZ_service_facts.py" <<< 9396 1727204037.31196: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmprmo0l8mr" to remote "/root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/AnsiballZ_service_facts.py" <<< 9396 1727204037.32026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204037.32095: stderr chunk (state=3): >>><<< 9396 1727204037.32098: stdout chunk (state=3): >>><<< 9396 1727204037.32111: done transferring module to remote 9396 1727204037.32123: _low_level_execute_command(): starting 9396 1727204037.32128: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/ /root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/AnsiballZ_service_facts.py && sleep 0' 9396 1727204037.32580: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204037.32622: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204037.32626: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204037.32629: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204037.32632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204037.32683: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204037.32694: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204037.32735: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204037.34690: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204037.34734: stderr chunk (state=3): >>><<< 9396 1727204037.34766: stdout chunk (state=3): >>><<< 9396 1727204037.34791: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204037.34800: _low_level_execute_command(): starting 9396 1727204037.34806: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/AnsiballZ_service_facts.py && sleep 0' 9396 1727204037.35415: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204037.35418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204037.35421: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204037.35425: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204037.35476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204037.35480: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204037.35528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204039.43706: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": 
"dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": 
"ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 9396 1727204039.43734: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, 
"nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": 
"systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": 
"systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service"<<< 9396 1727204039.43750: stdout chunk (state=3): >>>, "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": 
"systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": 
"systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name<<< 9396 1727204039.43757: stdout chunk (state=3): >>>": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": 
"alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inact<<< 9396 1727204039.43791: stdout chunk (state=3): >>>ive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": 
"man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": 
{"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd<<< 9396 1727204039.43794: stdout chunk (state=3): >>>"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": 
"systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 9396 1727204039.45467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204039.45537: stderr chunk (state=3): >>><<< 9396 1727204039.45542: stdout chunk (state=3): >>><<< 9396 1727204039.45566: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": 
{"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": 
"rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": 
"systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, 
"systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": 
"plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
9396 1727204039.46167: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204039.46178: _low_level_execute_command(): starting 9396 1727204039.46184: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204037.0905485-10748-204915896132586/ > /dev/null 2>&1 && sleep 0' 9396 1727204039.46678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204039.46681: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204039.46684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204039.46686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204039.46691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204039.46746: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204039.46750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204039.46801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204039.48758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204039.48822: stderr chunk (state=3): >>><<< 9396 1727204039.48825: stdout chunk (state=3): >>><<< 9396 1727204039.48840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204039.48848: handler run complete 9396 
1727204039.49026: variable 'ansible_facts' from source: unknown 9396 1727204039.50116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204039.50543: variable 'ansible_facts' from source: unknown 9396 1727204039.50671: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204039.50874: attempt loop complete, returning result 9396 1727204039.50880: _execute() done 9396 1727204039.50883: dumping result to json 9396 1727204039.50934: done dumping result, returning 9396 1727204039.50945: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-36c5-1f9e-00000000018e] 9396 1727204039.50948: sending task result for task 12b410aa-8751-36c5-1f9e-00000000018e ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204039.51734: no more pending results, returning what we have 9396 1727204039.51737: results queue empty 9396 1727204039.51737: checking for any_errors_fatal 9396 1727204039.51740: done checking for any_errors_fatal 9396 1727204039.51741: checking for max_fail_percentage 9396 1727204039.51742: done checking for max_fail_percentage 9396 1727204039.51742: checking to see if all hosts have failed and the running result is not ok 9396 1727204039.51743: done checking to see if all hosts have failed 9396 1727204039.51744: getting the remaining hosts for this loop 9396 1727204039.51745: done getting the remaining hosts for this loop 9396 1727204039.51747: getting the next task for host managed-node1 9396 1727204039.51751: done getting next task for host managed-node1 9396 1727204039.51754: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 9396 1727204039.51757: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204039.51767: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000018e 9396 1727204039.51771: WORKER PROCESS EXITING 9396 1727204039.51775: getting variables 9396 1727204039.51776: in VariableManager get_vars() 9396 1727204039.51804: Calling all_inventory to load vars for managed-node1 9396 1727204039.51807: Calling groups_inventory to load vars for managed-node1 9396 1727204039.51814: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204039.51823: Calling all_plugins_play to load vars for managed-node1 9396 1727204039.51825: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204039.51827: Calling groups_plugins_play to load vars for managed-node1 9396 1727204039.52144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204039.52660: done with get_vars() 9396 1727204039.52671: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:53:59 -0400 (0:00:02.512) 
0:00:15.498 ***** 9396 1727204039.52756: entering _queue_task() for managed-node1/package_facts 9396 1727204039.52761: Creating lock for package_facts 9396 1727204039.53008: worker is 1 (out of 1 available) 9396 1727204039.53023: exiting _queue_task() for managed-node1/package_facts 9396 1727204039.53036: done queuing things up, now waiting for results queue to drain 9396 1727204039.53038: waiting for pending results... 9396 1727204039.53222: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 9396 1727204039.53338: in run() - task 12b410aa-8751-36c5-1f9e-00000000018f 9396 1727204039.53349: variable 'ansible_search_path' from source: unknown 9396 1727204039.53352: variable 'ansible_search_path' from source: unknown 9396 1727204039.53385: calling self._execute() 9396 1727204039.53458: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204039.53462: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204039.53474: variable 'omit' from source: magic vars 9396 1727204039.53791: variable 'ansible_distribution_major_version' from source: facts 9396 1727204039.53803: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204039.53814: variable 'omit' from source: magic vars 9396 1727204039.53871: variable 'omit' from source: magic vars 9396 1727204039.53902: variable 'omit' from source: magic vars 9396 1727204039.53941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204039.53973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204039.53993: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204039.54012: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 
1727204039.54022: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204039.54054: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204039.54058: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204039.54060: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204039.54146: Set connection var ansible_timeout to 10 9396 1727204039.54156: Set connection var ansible_shell_executable to /bin/sh 9396 1727204039.54162: Set connection var ansible_pipelining to False 9396 1727204039.54170: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204039.54177: Set connection var ansible_connection to ssh 9396 1727204039.54180: Set connection var ansible_shell_type to sh 9396 1727204039.54204: variable 'ansible_shell_executable' from source: unknown 9396 1727204039.54209: variable 'ansible_connection' from source: unknown 9396 1727204039.54213: variable 'ansible_module_compression' from source: unknown 9396 1727204039.54215: variable 'ansible_shell_type' from source: unknown 9396 1727204039.54218: variable 'ansible_shell_executable' from source: unknown 9396 1727204039.54220: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204039.54226: variable 'ansible_pipelining' from source: unknown 9396 1727204039.54228: variable 'ansible_timeout' from source: unknown 9396 1727204039.54234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204039.54402: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204039.54467: variable 'omit' from source: magic vars 9396 1727204039.54470: starting attempt loop 9396 
1727204039.54475: running the handler 9396 1727204039.54477: _low_level_execute_command(): starting 9396 1727204039.54480: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204039.54992: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204039.54996: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204039.54999: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204039.55001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204039.55004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204039.55055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204039.55058: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204039.55069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204039.55129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204039.56892: stdout chunk (state=3): >>>/root <<< 9396 1727204039.56993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204039.57055: stderr chunk (state=3): >>><<< 9396 
1727204039.57059: stdout chunk (state=3): >>><<< 9396 1727204039.57085: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204039.57100: _low_level_execute_command(): starting 9396 1727204039.57113: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528 `" && echo ansible-tmp-1727204039.570849-10833-180800826089528="` echo /root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528 `" ) && sleep 0' 9396 1727204039.57595: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204039.57598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204039.57601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204039.57612: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204039.57617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204039.57667: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204039.57674: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204039.57718: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204039.59773: stdout chunk (state=3): >>>ansible-tmp-1727204039.570849-10833-180800826089528=/root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528 <<< 9396 1727204039.59891: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204039.59948: stderr chunk (state=3): >>><<< 9396 1727204039.59952: stdout chunk (state=3): >>><<< 9396 1727204039.59967: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204039.570849-10833-180800826089528=/root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204039.60018: variable 'ansible_module_compression' from source: unknown 9396 1727204039.60064: ANSIBALLZ: Using lock for package_facts 9396 1727204039.60068: ANSIBALLZ: Acquiring lock 9396 1727204039.60071: ANSIBALLZ: Lock acquired: 139797140665120 9396 1727204039.60074: ANSIBALLZ: Creating module 9396 1727204039.91697: ANSIBALLZ: Writing module into payload 9396 1727204039.91710: ANSIBALLZ: Writing module 9396 1727204039.91755: ANSIBALLZ: Renaming module 9396 1727204039.91770: ANSIBALLZ: Done creating module 9396 1727204039.91822: variable 'ansible_facts' from source: unknown 9396 1727204039.92049: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/AnsiballZ_package_facts.py 9396 1727204039.92330: Sending initial data 9396 1727204039.92343: Sent initial data (160 bytes) 9396 1727204039.92992: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204039.93015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204039.93106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204039.94830: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204039.94901: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204039.94944: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/AnsiballZ_package_facts.py" <<< 9396 1727204039.94967: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpkkbxjvxm /root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/AnsiballZ_package_facts.py <<< 9396 1727204039.95005: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpkkbxjvxm" to remote "/root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/AnsiballZ_package_facts.py" <<< 9396 1727204039.97655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204039.97668: stdout chunk (state=3): >>><<< 9396 1727204039.97684: stderr chunk (state=3): >>><<< 9396 1727204039.97726: done transferring module to remote 9396 1727204039.97744: _low_level_execute_command(): starting 9396 1727204039.97754: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/ /root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/AnsiballZ_package_facts.py && sleep 0' 9396 1727204039.98478: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204039.98580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204039.98585: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204039.98636: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204039.98655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204039.98711: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204039.98788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204040.00871: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204040.00905: stdout chunk (state=3): >>><<< 9396 1727204040.00908: stderr chunk (state=3): >>><<< 9396 1727204040.00925: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204040.01025: _low_level_execute_command(): starting 9396 1727204040.01029: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/AnsiballZ_package_facts.py && sleep 0' 9396 1727204040.01709: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204040.01801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204040.01822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204040.01867: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204040.01950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204040.66485: 
stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 9396 1727204040.66502: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 
2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 9396 1727204040.66538: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": 
"8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", 
"release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 9396 1727204040.66567: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 9396 1727204040.66588: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": 
[{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 9396 1727204040.66616: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": 
"glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": 
"libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 9396 1727204040.66663: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": 
"volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5<<< 9396 1727204040.66670: stdout chunk (state=3): >>>", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 9396 1727204040.66673: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", 
"version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": 
"500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 9396 1727204040.66688: stdout chunk (state=3): 
>>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": 
"perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": 
"avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 9396 1727204040.66715: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 9396 1727204040.66725: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", 
"version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 9396 1727204040.66753: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": 
"5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 9396 1727204040.66767: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 9396 1727204040.68709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204040.68771: stderr chunk (state=3): >>><<< 9396 1727204040.68776: stdout chunk (state=3): >>><<< 9396 1727204040.68824: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": 
[{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": 
"7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", 
"release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": 
"grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", 
"source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", 
"release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", 
"version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": 
"sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 
1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": 
"rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": 
"502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], 
"perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": 
[{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": 
"boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": 
"1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": 
[{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204040.71310: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204040.71332: _low_level_execute_command(): starting 9396 1727204040.71337: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204039.570849-10833-180800826089528/ > /dev/null 2>&1 && sleep 0' 9396 
1727204040.71845: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204040.71849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204040.71853: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204040.71856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204040.71920: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204040.71922: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204040.71924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204040.71966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204040.74449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204040.74453: stdout chunk (state=3): >>><<< 9396 1727204040.74456: stderr chunk (state=3): >>><<< 9396 1727204040.74459: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204040.74461: handler run complete 9396 1727204040.75906: variable 'ansible_facts' from source: unknown 9396 1727204040.76803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204040.79066: variable 'ansible_facts' from source: unknown 9396 1727204040.79490: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204040.80282: attempt loop complete, returning result 9396 1727204040.80305: _execute() done 9396 1727204040.80309: dumping result to json 9396 1727204040.80660: done dumping result, returning 9396 1727204040.80664: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-36c5-1f9e-00000000018f] 9396 1727204040.80667: sending task result for task 12b410aa-8751-36c5-1f9e-00000000018f 9396 1727204040.83054: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000018f 9396 1727204040.83058: WORKER 
PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204040.83105: no more pending results, returning what we have 9396 1727204040.83108: results queue empty 9396 1727204040.83109: checking for any_errors_fatal 9396 1727204040.83113: done checking for any_errors_fatal 9396 1727204040.83114: checking for max_fail_percentage 9396 1727204040.83115: done checking for max_fail_percentage 9396 1727204040.83116: checking to see if all hosts have failed and the running result is not ok 9396 1727204040.83116: done checking to see if all hosts have failed 9396 1727204040.83117: getting the remaining hosts for this loop 9396 1727204040.83118: done getting the remaining hosts for this loop 9396 1727204040.83121: getting the next task for host managed-node1 9396 1727204040.83126: done getting next task for host managed-node1 9396 1727204040.83129: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 9396 1727204040.83131: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204040.83142: getting variables 9396 1727204040.83144: in VariableManager get_vars() 9396 1727204040.83173: Calling all_inventory to load vars for managed-node1 9396 1727204040.83175: Calling groups_inventory to load vars for managed-node1 9396 1727204040.83177: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204040.83184: Calling all_plugins_play to load vars for managed-node1 9396 1727204040.83187: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204040.83191: Calling groups_plugins_play to load vars for managed-node1 9396 1727204040.84502: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204040.87916: done with get_vars() 9396 1727204040.87952: done getting variables 9396 1727204040.88012: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:00 -0400 (0:00:01.352) 0:00:16.851 ***** 9396 1727204040.88041: entering _queue_task() for managed-node1/debug 9396 1727204040.88315: worker is 1 (out of 1 available) 9396 1727204040.88330: exiting _queue_task() for managed-node1/debug 9396 1727204040.88345: done queuing things up, now waiting for results queue to drain 9396 1727204040.88347: waiting for pending results... 
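The censored `package_facts` result above (hidden because `no_log: true` was specified) corresponds to a task of roughly this shape; this is a sketch reconstructed from the module arguments logged in the invocation (`manager: ["auto"]`, `strategy: first`), not the role's actual source:

```yaml
# Sketch only: reconstructed from the invocation logged above.
# The real task lives in fedora.linux_system_roles.network.
- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto
    strategy: first
  no_log: true  # why the result above appears as "censored"
```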
9396 1727204040.88556: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 9396 1727204040.88652: in run() - task 12b410aa-8751-36c5-1f9e-000000000027 9396 1727204040.88666: variable 'ansible_search_path' from source: unknown 9396 1727204040.88669: variable 'ansible_search_path' from source: unknown 9396 1727204040.88710: calling self._execute() 9396 1727204040.88782: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204040.88794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204040.88803: variable 'omit' from source: magic vars 9396 1727204040.89126: variable 'ansible_distribution_major_version' from source: facts 9396 1727204040.89138: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204040.89146: variable 'omit' from source: magic vars 9396 1727204040.89200: variable 'omit' from source: magic vars 9396 1727204040.89297: variable 'network_provider' from source: set_fact 9396 1727204040.89315: variable 'omit' from source: magic vars 9396 1727204040.89354: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204040.89385: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204040.89411: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204040.89427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204040.89438: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204040.89474: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204040.89478: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204040.89481: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204040.89584: Set connection var ansible_timeout to 10 9396 1727204040.89592: Set connection var ansible_shell_executable to /bin/sh 9396 1727204040.89602: Set connection var ansible_pipelining to False 9396 1727204040.89699: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204040.89703: Set connection var ansible_connection to ssh 9396 1727204040.89705: Set connection var ansible_shell_type to sh 9396 1727204040.89709: variable 'ansible_shell_executable' from source: unknown 9396 1727204040.89712: variable 'ansible_connection' from source: unknown 9396 1727204040.89714: variable 'ansible_module_compression' from source: unknown 9396 1727204040.89716: variable 'ansible_shell_type' from source: unknown 9396 1727204040.89719: variable 'ansible_shell_executable' from source: unknown 9396 1727204040.89721: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204040.89722: variable 'ansible_pipelining' from source: unknown 9396 1727204040.89725: variable 'ansible_timeout' from source: unknown 9396 1727204040.89727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204040.89899: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204040.89927: variable 'omit' from source: magic vars 9396 1727204040.89940: starting attempt loop 9396 1727204040.90063: running the handler 9396 1727204040.90068: handler run complete 9396 1727204040.90071: attempt loop complete, returning result 9396 1727204040.90073: _execute() done 9396 1727204040.90076: dumping result to json 9396 1727204040.90078: done dumping result, returning 9396 1727204040.90081: done running TaskExecutor() 
for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-36c5-1f9e-000000000027] 9396 1727204040.90087: sending task result for task 12b410aa-8751-36c5-1f9e-000000000027 ok: [managed-node1] => {} MSG: Using network provider: nm 9396 1727204040.90283: no more pending results, returning what we have 9396 1727204040.90288: results queue empty 9396 1727204040.90291: checking for any_errors_fatal 9396 1727204040.90309: done checking for any_errors_fatal 9396 1727204040.90310: checking for max_fail_percentage 9396 1727204040.90312: done checking for max_fail_percentage 9396 1727204040.90313: checking to see if all hosts have failed and the running result is not ok 9396 1727204040.90314: done checking to see if all hosts have failed 9396 1727204040.90315: getting the remaining hosts for this loop 9396 1727204040.90317: done getting the remaining hosts for this loop 9396 1727204040.90322: getting the next task for host managed-node1 9396 1727204040.90328: done getting next task for host managed-node1 9396 1727204040.90334: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 9396 1727204040.90337: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204040.90350: getting variables 9396 1727204040.90352: in VariableManager get_vars() 9396 1727204040.90534: Calling all_inventory to load vars for managed-node1 9396 1727204040.90538: Calling groups_inventory to load vars for managed-node1 9396 1727204040.90540: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204040.90553: Calling all_plugins_play to load vars for managed-node1 9396 1727204040.90556: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204040.90561: Calling groups_plugins_play to load vars for managed-node1 9396 1727204040.91124: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000027 9396 1727204040.91128: WORKER PROCESS EXITING 9396 1727204040.92776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204040.95028: done with get_vars() 9396 1727204040.95061: done getting variables 9396 1727204040.95151: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:00 -0400 (0:00:00.071) 0:00:16.922 ***** 9396 1727204040.95180: entering _queue_task() for managed-node1/fail 9396 1727204040.95181: Creating lock for fail 9396 1727204040.95460: worker is 1 (out of 1 available) 9396 1727204040.95476: exiting _queue_task() for managed-node1/fail 9396 1727204040.95492: done queuing things up, now waiting for results queue to drain 9396 1727204040.95494: waiting for pending results... 
9396 1727204040.95686: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 9396 1727204040.95787: in run() - task 12b410aa-8751-36c5-1f9e-000000000028 9396 1727204040.95803: variable 'ansible_search_path' from source: unknown 9396 1727204040.95807: variable 'ansible_search_path' from source: unknown 9396 1727204040.95847: calling self._execute() 9396 1727204040.95920: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204040.95926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204040.95944: variable 'omit' from source: magic vars 9396 1727204040.96271: variable 'ansible_distribution_major_version' from source: facts 9396 1727204040.96287: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204040.96430: variable 'network_state' from source: role '' defaults 9396 1727204040.96595: Evaluated conditional (network_state != {}): False 9396 1727204040.96599: when evaluation is False, skipping this task 9396 1727204040.96602: _execute() done 9396 1727204040.96604: dumping result to json 9396 1727204040.96607: done dumping result, returning 9396 1727204040.96609: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-36c5-1f9e-000000000028] 9396 1727204040.96612: sending task result for task 12b410aa-8751-36c5-1f9e-000000000028 9396 1727204040.96694: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000028 9396 1727204040.96697: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 9396 1727204040.96845: no more pending results, returning what we have 9396 
1727204040.96848: results queue empty 9396 1727204040.96849: checking for any_errors_fatal 9396 1727204040.96854: done checking for any_errors_fatal 9396 1727204040.96855: checking for max_fail_percentage 9396 1727204040.96857: done checking for max_fail_percentage 9396 1727204040.96858: checking to see if all hosts have failed and the running result is not ok 9396 1727204040.96859: done checking to see if all hosts have failed 9396 1727204040.96860: getting the remaining hosts for this loop 9396 1727204040.96861: done getting the remaining hosts for this loop 9396 1727204040.96865: getting the next task for host managed-node1 9396 1727204040.96870: done getting next task for host managed-node1 9396 1727204040.96874: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 9396 1727204040.96877: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204040.96896: getting variables 9396 1727204040.96898: in VariableManager get_vars() 9396 1727204040.96948: Calling all_inventory to load vars for managed-node1 9396 1727204040.96951: Calling groups_inventory to load vars for managed-node1 9396 1727204040.96954: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204040.96965: Calling all_plugins_play to load vars for managed-node1 9396 1727204040.96968: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204040.96972: Calling groups_plugins_play to load vars for managed-node1 9396 1727204040.98692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204041.00872: done with get_vars() 9396 1727204041.00914: done getting variables 9396 1727204041.00991: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.058) 0:00:16.981 ***** 9396 1727204041.01031: entering _queue_task() for managed-node1/fail 9396 1727204041.01372: worker is 1 (out of 1 available) 9396 1727204041.01387: exiting _queue_task() for managed-node1/fail 9396 1727204041.01403: done queuing things up, now waiting for results queue to drain 9396 1727204041.01406: waiting for pending results... 
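Both `network_state` aborts logged so far were skipped because `network_state` comes from the role's defaults as an empty dict, so the logged condition `network_state != {}` evaluated to False. A sketch of such a guard task follows; the condition is copied from the logged `false_condition`, while the `msg` text is a placeholder, not the role's actual wording:

```yaml
# Sketch only: the when condition is taken from the logged
# false_condition; the fail message is a placeholder.
- name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
  ansible.builtin.fail:
    msg: Applying network_state is not supported here (placeholder text)
  when: network_state != {}  # False in this run, so the task is skipped
```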
9396 1727204041.01709: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 9396 1727204041.01826: in run() - task 12b410aa-8751-36c5-1f9e-000000000029 9396 1727204041.01839: variable 'ansible_search_path' from source: unknown 9396 1727204041.01842: variable 'ansible_search_path' from source: unknown 9396 1727204041.01881: calling self._execute() 9396 1727204041.01956: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204041.01964: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204041.01974: variable 'omit' from source: magic vars 9396 1727204041.02285: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.02299: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204041.02405: variable 'network_state' from source: role '' defaults 9396 1727204041.02417: Evaluated conditional (network_state != {}): False 9396 1727204041.02423: when evaluation is False, skipping this task 9396 1727204041.02426: _execute() done 9396 1727204041.02429: dumping result to json 9396 1727204041.02433: done dumping result, returning 9396 1727204041.02445: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-36c5-1f9e-000000000029] 9396 1727204041.02449: sending task result for task 12b410aa-8751-36c5-1f9e-000000000029 9396 1727204041.02546: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000029 9396 1727204041.02549: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 9396 1727204041.02606: no more pending results, returning what we have 9396 1727204041.02613: results queue 
empty 9396 1727204041.02615: checking for any_errors_fatal 9396 1727204041.02625: done checking for any_errors_fatal 9396 1727204041.02626: checking for max_fail_percentage 9396 1727204041.02628: done checking for max_fail_percentage 9396 1727204041.02629: checking to see if all hosts have failed and the running result is not ok 9396 1727204041.02630: done checking to see if all hosts have failed 9396 1727204041.02631: getting the remaining hosts for this loop 9396 1727204041.02632: done getting the remaining hosts for this loop 9396 1727204041.02637: getting the next task for host managed-node1 9396 1727204041.02643: done getting next task for host managed-node1 9396 1727204041.02647: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 9396 1727204041.02650: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204041.02676: getting variables 9396 1727204041.02678: in VariableManager get_vars() 9396 1727204041.02720: Calling all_inventory to load vars for managed-node1 9396 1727204041.02723: Calling groups_inventory to load vars for managed-node1 9396 1727204041.02726: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204041.02736: Calling all_plugins_play to load vars for managed-node1 9396 1727204041.02740: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204041.02743: Calling groups_plugins_play to load vars for managed-node1 9396 1727204041.04367: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204041.06061: done with get_vars() 9396 1727204041.06090: done getting variables 9396 1727204041.06143: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.051) 0:00:17.032 ***** 9396 1727204041.06172: entering _queue_task() for managed-node1/fail 9396 1727204041.06434: worker is 1 (out of 1 available) 9396 1727204041.06448: exiting _queue_task() for managed-node1/fail 9396 1727204041.06462: done queuing things up, now waiting for results queue to drain 9396 1727204041.06464: waiting for pending results... 
9396 1727204041.06664: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 9396 1727204041.06774: in run() - task 12b410aa-8751-36c5-1f9e-00000000002a 9396 1727204041.06787: variable 'ansible_search_path' from source: unknown 9396 1727204041.06793: variable 'ansible_search_path' from source: unknown 9396 1727204041.06830: calling self._execute() 9396 1727204041.06902: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204041.06909: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204041.06924: variable 'omit' from source: magic vars 9396 1727204041.07239: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.07250: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204041.07407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204041.09170: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204041.09227: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204041.09260: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204041.09290: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204041.09318: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204041.09388: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.09417: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.09442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.09477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.09491: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.09574: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.09587: Evaluated conditional (ansible_distribution_major_version | int > 9): True 9396 1727204041.09688: variable 'ansible_distribution' from source: facts 9396 1727204041.09694: variable '__network_rh_distros' from source: role '' defaults 9396 1727204041.09703: Evaluated conditional (ansible_distribution in __network_rh_distros): False 9396 1727204041.09706: when evaluation is False, skipping this task 9396 1727204041.09713: _execute() done 9396 1727204041.09718: dumping result to json 9396 1727204041.09722: done dumping result, returning 9396 1727204041.09731: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-36c5-1f9e-00000000002a] 9396 1727204041.09739: sending task result for task 12b410aa-8751-36c5-1f9e-00000000002a 9396 1727204041.09833: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000002a 9396 1727204041.09836: WORKER PROCESS EXITING skipping: 
[managed-node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 9396 1727204041.09896: no more pending results, returning what we have 9396 1727204041.09900: results queue empty 9396 1727204041.09902: checking for any_errors_fatal 9396 1727204041.09909: done checking for any_errors_fatal 9396 1727204041.09910: checking for max_fail_percentage 9396 1727204041.09912: done checking for max_fail_percentage 9396 1727204041.09913: checking to see if all hosts have failed and the running result is not ok 9396 1727204041.09914: done checking to see if all hosts have failed 9396 1727204041.09914: getting the remaining hosts for this loop 9396 1727204041.09916: done getting the remaining hosts for this loop 9396 1727204041.09921: getting the next task for host managed-node1 9396 1727204041.09928: done getting next task for host managed-node1 9396 1727204041.09932: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 9396 1727204041.09936: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204041.09953: getting variables 9396 1727204041.09955: in VariableManager get_vars() 9396 1727204041.09997: Calling all_inventory to load vars for managed-node1 9396 1727204041.10001: Calling groups_inventory to load vars for managed-node1 9396 1727204041.10004: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204041.10016: Calling all_plugins_play to load vars for managed-node1 9396 1727204041.10019: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204041.10023: Calling groups_plugins_play to load vars for managed-node1 9396 1727204041.11234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204041.12801: done with get_vars() 9396 1727204041.12826: done getting variables 9396 1727204041.12908: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.067) 0:00:17.100 ***** 9396 1727204041.12936: entering _queue_task() for managed-node1/dnf 9396 1727204041.13182: worker is 1 (out of 1 available) 9396 1727204041.13199: exiting _queue_task() for managed-node1/dnf 9396 1727204041.13212: done queuing things up, now waiting for results queue to drain 9396 1727204041.13215: waiting for pending results... 
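[Editor's sketch] The teaming-abort task that the log reports skipping just above can be reconstructed, for readability, from the two `when:` conditionals the debug output shows being evaluated. This is a hedged sketch only: the task name and both conditions are taken verbatim from the log, while the `fail:` module and its message are assumptions (the role's actual source may differ):

```yaml
# Hypothetical reconstruction based only on the log; not the role's source.
- name: "Abort applying teaming configuration if the system version of the managed host is EL10 or later"
  ansible.builtin.fail:  # assumed module; the log records only the conditionals
    msg: Teaming is not supported on EL10 or later  # assumed message
  when:
    - ansible_distribution_major_version | int > 9   # evaluated True in the log
    - ansible_distribution in __network_rh_distros   # evaluated False -> task skipped
```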
9396 1727204041.13408: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 9396 1727204041.13507: in run() - task 12b410aa-8751-36c5-1f9e-00000000002b 9396 1727204041.13524: variable 'ansible_search_path' from source: unknown 9396 1727204041.13528: variable 'ansible_search_path' from source: unknown 9396 1727204041.13566: calling self._execute() 9396 1727204041.13638: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204041.13645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204041.13663: variable 'omit' from source: magic vars 9396 1727204041.13967: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.13978: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204041.14158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204041.16192: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204041.16247: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204041.16278: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204041.16321: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204041.16345: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204041.16422: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.16446: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.16467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.16507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.16520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.16632: variable 'ansible_distribution' from source: facts 9396 1727204041.16636: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.16642: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 9396 1727204041.16744: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204041.16860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.16880: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.16903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.16939: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.16959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.16991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.17012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.17033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.17068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.17084: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.17122: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.17142: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.17167: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.17204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.17217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.17348: variable 'network_connections' from source: task vars 9396 1727204041.17360: variable 'controller_profile' from source: play vars 9396 1727204041.17420: variable 'controller_profile' from source: play vars 9396 1727204041.17429: variable 'controller_device' from source: play vars 9396 1727204041.17479: variable 'controller_device' from source: play vars 9396 1727204041.17493: variable 'port1_profile' from source: play vars 9396 1727204041.17549: variable 'port1_profile' from source: play vars 9396 1727204041.17556: variable 'dhcp_interface1' from source: play vars 9396 1727204041.17616: variable 'dhcp_interface1' from source: play vars 9396 1727204041.17623: variable 'controller_profile' from source: play vars 9396 1727204041.17672: variable 'controller_profile' from source: play vars 9396 1727204041.17680: variable 'port2_profile' from source: play vars 9396 1727204041.17737: variable 'port2_profile' from source: play vars 9396 1727204041.17744: variable 'dhcp_interface2' from source: play vars 9396 1727204041.17794: variable 'dhcp_interface2' from source: play vars 9396 1727204041.17807: variable 'controller_profile' from source: play vars 9396 1727204041.17859: variable 'controller_profile' from source: play vars 9396 1727204041.17927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 9396 1727204041.18081: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204041.18117: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204041.18146: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204041.18175: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204041.18215: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204041.18234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204041.18266: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.18284: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204041.18341: variable '__network_team_connections_defined' from source: role '' defaults 9396 1727204041.18549: variable 'network_connections' from source: task vars 9396 1727204041.18553: variable 'controller_profile' from source: play vars 9396 1727204041.18610: variable 'controller_profile' from source: play vars 9396 1727204041.18619: variable 'controller_device' from source: play vars 9396 1727204041.18669: variable 'controller_device' from source: play vars 9396 1727204041.18679: variable 'port1_profile' from source: play vars 9396 1727204041.18736: variable 'port1_profile' from source: 
play vars 9396 1727204041.18743: variable 'dhcp_interface1' from source: play vars 9396 1727204041.18795: variable 'dhcp_interface1' from source: play vars 9396 1727204041.18802: variable 'controller_profile' from source: play vars 9396 1727204041.18855: variable 'controller_profile' from source: play vars 9396 1727204041.18862: variable 'port2_profile' from source: play vars 9396 1727204041.18924: variable 'port2_profile' from source: play vars 9396 1727204041.18928: variable 'dhcp_interface2' from source: play vars 9396 1727204041.18975: variable 'dhcp_interface2' from source: play vars 9396 1727204041.18982: variable 'controller_profile' from source: play vars 9396 1727204041.19039: variable 'controller_profile' from source: play vars 9396 1727204041.19066: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 9396 1727204041.19070: when evaluation is False, skipping this task 9396 1727204041.19073: _execute() done 9396 1727204041.19078: dumping result to json 9396 1727204041.19082: done dumping result, returning 9396 1727204041.19092: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-36c5-1f9e-00000000002b] 9396 1727204041.19099: sending task result for task 12b410aa-8751-36c5-1f9e-00000000002b 9396 1727204041.19208: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000002b 9396 1727204041.19211: WORKER PROCESS EXITING skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
} 9396 1727204041.19284: no more pending results, returning what we have 9396 1727204041.19291: results queue empty 9396 1727204041.19292: checking for any_errors_fatal 9396 1727204041.19297: done checking for
any_errors_fatal 9396 1727204041.19298: checking for max_fail_percentage 9396 1727204041.19300: done checking for max_fail_percentage 9396 1727204041.19301: checking to see if all hosts have failed and the running result is not ok 9396 1727204041.19302: done checking to see if all hosts have failed 9396 1727204041.19303: getting the remaining hosts for this loop 9396 1727204041.19305: done getting the remaining hosts for this loop 9396 1727204041.19310: getting the next task for host managed-node1 9396 1727204041.19316: done getting next task for host managed-node1 9396 1727204041.19321: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 9396 1727204041.19324: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204041.19341: getting variables 9396 1727204041.19343: in VariableManager get_vars() 9396 1727204041.19384: Calling all_inventory to load vars for managed-node1 9396 1727204041.19387: Calling groups_inventory to load vars for managed-node1 9396 1727204041.19400: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204041.19411: Calling all_plugins_play to load vars for managed-node1 9396 1727204041.19414: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204041.19418: Calling groups_plugins_play to load vars for managed-node1 9396 1727204041.20765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204041.22354: done with get_vars() 9396 1727204041.22380: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 9396 1727204041.22452: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.095) 0:00:17.195 ***** 9396 1727204041.22479: entering _queue_task() for managed-node1/yum 9396 1727204041.22480: Creating lock for yum 9396 1727204041.22763: worker is 1 (out of 1 available) 9396 1727204041.22781: exiting _queue_task() for managed-node1/yum 9396 1727204041.22796: done queuing things up, now waiting for results queue to drain 9396 1727204041.22798: waiting for pending results... 
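[Editor's sketch] The DNF-update check skipped above (task path `roles/network/tasks/main.yml:36`) evaluates three conditionals that are all visible in the log. A minimal sketch under the assumption that the task calls the `dnf` module with `list: updates` — the module arguments are not shown in the log and are assumptions:

```yaml
# Hedged reconstruction from the log output only; module arguments are assumed.
- name: "Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces"
  ansible.builtin.dnf:
    list: updates  # assumed; the log records only the dnf action plugin load
  when:
    - ansible_distribution_major_version != '6'                                         # True
    - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7  # True
    - __network_wireless_connections_defined or __network_team_connections_defined      # False -> skipped
```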
9396 1727204041.22981: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 9396 1727204041.23081: in run() - task 12b410aa-8751-36c5-1f9e-00000000002c 9396 1727204041.23095: variable 'ansible_search_path' from source: unknown 9396 1727204041.23098: variable 'ansible_search_path' from source: unknown 9396 1727204041.23140: calling self._execute() 9396 1727204041.23206: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204041.23213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204041.23222: variable 'omit' from source: magic vars 9396 1727204041.23535: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.23547: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204041.23717: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204041.25695: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204041.25699: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204041.25702: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204041.25704: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204041.25707: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204041.25791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.25834: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.25872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.25931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.25958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.26076: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.26113: Evaluated conditional (ansible_distribution_major_version | int < 8): False 9396 1727204041.26122: when evaluation is False, skipping this task 9396 1727204041.26131: _execute() done 9396 1727204041.26139: dumping result to json 9396 1727204041.26149: done dumping result, returning 9396 1727204041.26161: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-36c5-1f9e-00000000002c] 9396 1727204041.26174: sending task result for task 12b410aa-8751-36c5-1f9e-00000000002c skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
} 9396 1727204041.26345: no more pending results, returning what we have 9396 1727204041.26350: results queue empty 9396 1727204041.26352: checking for any_errors_fatal 9396 1727204041.26358: done checking for any_errors_fatal
9396 1727204041.26359: checking for max_fail_percentage 9396 1727204041.26361: done checking for max_fail_percentage 9396 1727204041.26362: checking to see if all hosts have failed and the running result is not ok 9396 1727204041.26363: done checking to see if all hosts have failed 9396 1727204041.26364: getting the remaining hosts for this loop 9396 1727204041.26366: done getting the remaining hosts for this loop 9396 1727204041.26371: getting the next task for host managed-node1 9396 1727204041.26379: done getting next task for host managed-node1 9396 1727204041.26383: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 9396 1727204041.26387: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204041.26406: getting variables 9396 1727204041.26408: in VariableManager get_vars() 9396 1727204041.26456: Calling all_inventory to load vars for managed-node1 9396 1727204041.26460: Calling groups_inventory to load vars for managed-node1 9396 1727204041.26463: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204041.26476: Calling all_plugins_play to load vars for managed-node1 9396 1727204041.26480: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204041.26485: Calling groups_plugins_play to load vars for managed-node1 9396 1727204041.27243: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000002c 9396 1727204041.27248: WORKER PROCESS EXITING 9396 1727204041.28115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204041.30342: done with get_vars() 9396 1727204041.30368: done getting variables 9396 1727204041.30429: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.079) 0:00:17.275 ***** 9396 1727204041.30458: entering _queue_task() for managed-node1/fail 9396 1727204041.30728: worker is 1 (out of 1 available) 9396 1727204041.30743: exiting _queue_task() for managed-node1/fail 9396 1727204041.30756: done queuing things up, now waiting for results queue to drain 9396 1727204041.30758: waiting for pending results... 
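[Editor's sketch] The YUM-variant check (`roles/network/tasks/main.yml:48`) was skipped because the host's distribution major version is not below 8; note that the log also shows `ansible.builtin.yum` being redirected to `ansible.builtin.dnf` on this host. A hedged sketch, with the guard condition taken from the log and the module arguments again assumed:

```yaml
# Hypothetical sketch; only the when: condition appears in the log.
- name: "Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces"
  ansible.builtin.yum:  # redirected to ansible.builtin.dnf by the action loader
    list: updates       # assumed argument
  when:
    - ansible_distribution_major_version | int < 8  # evaluated False -> task skipped
```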
9396 1727204041.30945: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 9396 1727204041.31044: in run() - task 12b410aa-8751-36c5-1f9e-00000000002d 9396 1727204041.31057: variable 'ansible_search_path' from source: unknown 9396 1727204041.31061: variable 'ansible_search_path' from source: unknown 9396 1727204041.31096: calling self._execute() 9396 1727204041.31172: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204041.31179: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204041.31188: variable 'omit' from source: magic vars 9396 1727204041.31500: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.31535: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204041.31617: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204041.31858: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204041.34764: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204041.34872: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204041.34937: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204041.34982: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204041.35033: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204041.35147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 9396 1727204041.35187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.35239: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.35298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.35337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.35416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.35460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.35499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.35570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.35597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 
(found_in_cache=True, class_only=False) 9396 1727204041.35695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.35711: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.35753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.35842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.35849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.36124: variable 'network_connections' from source: task vars 9396 1727204041.36164: variable 'controller_profile' from source: play vars 9396 1727204041.36260: variable 'controller_profile' from source: play vars 9396 1727204041.36317: variable 'controller_device' from source: play vars 9396 1727204041.36382: variable 'controller_device' from source: play vars 9396 1727204041.36404: variable 'port1_profile' from source: play vars 9396 1727204041.36499: variable 'port1_profile' from source: play vars 9396 1727204041.36518: variable 'dhcp_interface1' from source: play vars 9396 1727204041.36644: variable 'dhcp_interface1' from source: play vars 9396 1727204041.36649: variable 'controller_profile' from source: play vars 9396 1727204041.36717: variable 'controller_profile' from source: play vars 9396 
1727204041.36730: variable 'port2_profile' from source: play vars 9396 1727204041.36862: variable 'port2_profile' from source: play vars 9396 1727204041.36865: variable 'dhcp_interface2' from source: play vars 9396 1727204041.36925: variable 'dhcp_interface2' from source: play vars 9396 1727204041.36938: variable 'controller_profile' from source: play vars 9396 1727204041.37023: variable 'controller_profile' from source: play vars 9396 1727204041.37130: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204041.37392: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204041.37468: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204041.37501: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204041.37695: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204041.37698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204041.37701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204041.37703: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.37705: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204041.37787: variable '__network_team_connections_defined' from 
source: role '' defaults 9396 1727204041.38133: variable 'network_connections' from source: task vars 9396 1727204041.38145: variable 'controller_profile' from source: play vars 9396 1727204041.38235: variable 'controller_profile' from source: play vars 9396 1727204041.38249: variable 'controller_device' from source: play vars 9396 1727204041.38383: variable 'controller_device' from source: play vars 9396 1727204041.38387: variable 'port1_profile' from source: play vars 9396 1727204041.38447: variable 'port1_profile' from source: play vars 9396 1727204041.38461: variable 'dhcp_interface1' from source: play vars 9396 1727204041.38549: variable 'dhcp_interface1' from source: play vars 9396 1727204041.38563: variable 'controller_profile' from source: play vars 9396 1727204041.38648: variable 'controller_profile' from source: play vars 9396 1727204041.38664: variable 'port2_profile' from source: play vars 9396 1727204041.38794: variable 'port2_profile' from source: play vars 9396 1727204041.38798: variable 'dhcp_interface2' from source: play vars 9396 1727204041.38856: variable 'dhcp_interface2' from source: play vars 9396 1727204041.38870: variable 'controller_profile' from source: play vars 9396 1727204041.38966: variable 'controller_profile' from source: play vars 9396 1727204041.39019: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 9396 1727204041.39037: when evaluation is False, skipping this task 9396 1727204041.39052: _execute() done 9396 1727204041.39063: dumping result to json 9396 1727204041.39147: done dumping result, returning 9396 1727204041.39152: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-36c5-1f9e-00000000002d] 9396 1727204041.39155: sending task result for task 12b410aa-8751-36c5-1f9e-00000000002d 9396 1727204041.39236: done sending task result for task 
12b410aa-8751-36c5-1f9e-00000000002d 9396 1727204041.39239: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 9396 1727204041.39314: no more pending results, returning what we have 9396 1727204041.39319: results queue empty 9396 1727204041.39320: checking for any_errors_fatal 9396 1727204041.39326: done checking for any_errors_fatal 9396 1727204041.39327: checking for max_fail_percentage 9396 1727204041.39329: done checking for max_fail_percentage 9396 1727204041.39330: checking to see if all hosts have failed and the running result is not ok 9396 1727204041.39331: done checking to see if all hosts have failed 9396 1727204041.39331: getting the remaining hosts for this loop 9396 1727204041.39333: done getting the remaining hosts for this loop 9396 1727204041.39338: getting the next task for host managed-node1 9396 1727204041.39345: done getting next task for host managed-node1 9396 1727204041.39394: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 9396 1727204041.39398: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204041.39418: getting variables 9396 1727204041.39420: in VariableManager get_vars() 9396 1727204041.39710: Calling all_inventory to load vars for managed-node1 9396 1727204041.39714: Calling groups_inventory to load vars for managed-node1 9396 1727204041.39717: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204041.39728: Calling all_plugins_play to load vars for managed-node1 9396 1727204041.39731: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204041.39734: Calling groups_plugins_play to load vars for managed-node1 9396 1727204041.42074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204041.45496: done with get_vars() 9396 1727204041.45558: done getting variables 9396 1727204041.45654: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.152) 0:00:17.427 ***** 9396 1727204041.45699: entering _queue_task() for managed-node1/package 9396 1727204041.46228: worker is 1 (out of 1 available) 9396 1727204041.46241: exiting _queue_task() for managed-node1/package 9396 1727204041.46255: done queuing things up, now waiting for results queue to drain 9396 1727204041.46258: waiting for pending results... 
9396 1727204041.46619: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages 9396 1727204041.46667: in run() - task 12b410aa-8751-36c5-1f9e-00000000002e 9396 1727204041.46693: variable 'ansible_search_path' from source: unknown 9396 1727204041.46719: variable 'ansible_search_path' from source: unknown 9396 1727204041.46769: calling self._execute() 9396 1727204041.46883: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204041.46901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204041.46934: variable 'omit' from source: magic vars 9396 1727204041.47442: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.47476: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204041.47777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204041.48237: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204041.48242: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204041.48280: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204041.48344: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204041.48486: variable 'network_packages' from source: role '' defaults 9396 1727204041.48582: variable '__network_provider_setup' from source: role '' defaults 9396 1727204041.48594: variable '__network_service_name_default_nm' from source: role '' defaults 9396 1727204041.48650: variable '__network_service_name_default_nm' from source: role '' defaults 9396 1727204041.48658: variable '__network_packages_default_nm' from source: role '' defaults 9396 1727204041.48721: variable '__network_packages_default_nm' from source: role 
'' defaults 9396 1727204041.48909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204041.54406: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204041.54466: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204041.54499: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204041.54533: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204041.54554: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204041.54624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.54647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.54669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.54703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.54721: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.54762: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.54781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.54804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.54841: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.54854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.55045: variable '__network_packages_default_gobject_packages' from source: role '' defaults 9396 1727204041.55149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.55171: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.55192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.55226: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.55238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.55318: variable 'ansible_python' from source: facts 9396 1727204041.55340: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 9396 1727204041.55415: variable '__network_wpa_supplicant_required' from source: role '' defaults 9396 1727204041.55480: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 9396 1727204041.55591: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.55616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.55637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.55667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.55680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.55739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204041.55763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204041.55783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.55823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204041.55835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204041.55973: variable 'network_connections' from source: task vars 9396 1727204041.55977: variable 'controller_profile' from source: play vars 9396 1727204041.56064: variable 'controller_profile' from source: play vars 9396 1727204041.56073: variable 'controller_device' from source: play vars 9396 1727204041.56168: variable 'controller_device' from source: play vars 9396 1727204041.56179: variable 'port1_profile' from source: play vars 9396 1727204041.56264: variable 'port1_profile' from source: play vars 9396 1727204041.56273: variable 'dhcp_interface1' from source: play vars 9396 1727204041.56354: variable 'dhcp_interface1' from source: play vars 9396 1727204041.56366: variable 'controller_profile' from source: play vars 9396 1727204041.56443: variable 'controller_profile' from source: play vars 9396 1727204041.56452: variable 'port2_profile' from source: play vars 9396 1727204041.56535: variable 'port2_profile' from source: play vars 9396 1727204041.56544: 
variable 'dhcp_interface2' from source: play vars 9396 1727204041.56627: variable 'dhcp_interface2' from source: play vars 9396 1727204041.56635: variable 'controller_profile' from source: play vars 9396 1727204041.56718: variable 'controller_profile' from source: play vars 9396 1727204041.56779: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204041.56807: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204041.56832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204041.56857: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204041.56896: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204041.57301: variable 'network_connections' from source: task vars 9396 1727204041.57305: variable 'controller_profile' from source: play vars 9396 1727204041.57341: variable 'controller_profile' from source: play vars 9396 1727204041.57351: variable 'controller_device' from source: play vars 9396 1727204041.57462: variable 'controller_device' from source: play vars 9396 1727204041.57474: variable 'port1_profile' from source: play vars 9396 1727204041.57629: variable 'port1_profile' from source: play vars 9396 1727204041.57640: variable 'dhcp_interface1' from source: play vars 9396 1727204041.57778: variable 'dhcp_interface1' from source: play vars 9396 1727204041.57787: variable 'controller_profile' from source: play vars 9396 
1727204041.57899: variable 'controller_profile' from source: play vars 9396 1727204041.57910: variable 'port2_profile' from source: play vars 9396 1727204041.58018: variable 'port2_profile' from source: play vars 9396 1727204041.58028: variable 'dhcp_interface2' from source: play vars 9396 1727204041.58134: variable 'dhcp_interface2' from source: play vars 9396 1727204041.58143: variable 'controller_profile' from source: play vars 9396 1727204041.58251: variable 'controller_profile' from source: play vars 9396 1727204041.58315: variable '__network_packages_default_wireless' from source: role '' defaults 9396 1727204041.58407: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204041.58766: variable 'network_connections' from source: task vars 9396 1727204041.58772: variable 'controller_profile' from source: play vars 9396 1727204041.58847: variable 'controller_profile' from source: play vars 9396 1727204041.58855: variable 'controller_device' from source: play vars 9396 1727204041.58936: variable 'controller_device' from source: play vars 9396 1727204041.58945: variable 'port1_profile' from source: play vars 9396 1727204041.59014: variable 'port1_profile' from source: play vars 9396 1727204041.59022: variable 'dhcp_interface1' from source: play vars 9396 1727204041.59304: variable 'dhcp_interface1' from source: play vars 9396 1727204041.59314: variable 'controller_profile' from source: play vars 9396 1727204041.59385: variable 'controller_profile' from source: play vars 9396 1727204041.59480: variable 'port2_profile' from source: play vars 9396 1727204041.59697: variable 'port2_profile' from source: play vars 9396 1727204041.59700: variable 'dhcp_interface2' from source: play vars 9396 1727204041.59703: variable 'dhcp_interface2' from source: play vars 9396 1727204041.59705: variable 'controller_profile' from source: play vars 9396 1727204041.59753: variable 'controller_profile' from source: play vars 9396 1727204041.59794: 
variable '__network_packages_default_team' from source: role '' defaults 9396 1727204041.59907: variable '__network_team_connections_defined' from source: role '' defaults 9396 1727204041.60352: variable 'network_connections' from source: task vars 9396 1727204041.60375: variable 'controller_profile' from source: play vars 9396 1727204041.60462: variable 'controller_profile' from source: play vars 9396 1727204041.60493: variable 'controller_device' from source: play vars 9396 1727204041.60551: variable 'controller_device' from source: play vars 9396 1727204041.60560: variable 'port1_profile' from source: play vars 9396 1727204041.60620: variable 'port1_profile' from source: play vars 9396 1727204041.60630: variable 'dhcp_interface1' from source: play vars 9396 1727204041.60686: variable 'dhcp_interface1' from source: play vars 9396 1727204041.60698: variable 'controller_profile' from source: play vars 9396 1727204041.60754: variable 'controller_profile' from source: play vars 9396 1727204041.60761: variable 'port2_profile' from source: play vars 9396 1727204041.60819: variable 'port2_profile' from source: play vars 9396 1727204041.60827: variable 'dhcp_interface2' from source: play vars 9396 1727204041.60880: variable 'dhcp_interface2' from source: play vars 9396 1727204041.60887: variable 'controller_profile' from source: play vars 9396 1727204041.60947: variable 'controller_profile' from source: play vars 9396 1727204041.61002: variable '__network_service_name_default_initscripts' from source: role '' defaults 9396 1727204041.61056: variable '__network_service_name_default_initscripts' from source: role '' defaults 9396 1727204041.61063: variable '__network_packages_default_initscripts' from source: role '' defaults 9396 1727204041.61116: variable '__network_packages_default_initscripts' from source: role '' defaults 9396 1727204041.61306: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 9396 1727204041.61705: variable 
'network_connections' from source: task vars 9396 1727204041.61712: variable 'controller_profile' from source: play vars 9396 1727204041.61763: variable 'controller_profile' from source: play vars 9396 1727204041.61771: variable 'controller_device' from source: play vars 9396 1727204041.61827: variable 'controller_device' from source: play vars 9396 1727204041.61835: variable 'port1_profile' from source: play vars 9396 1727204041.61884: variable 'port1_profile' from source: play vars 9396 1727204041.61893: variable 'dhcp_interface1' from source: play vars 9396 1727204041.61947: variable 'dhcp_interface1' from source: play vars 9396 1727204041.61954: variable 'controller_profile' from source: play vars 9396 1727204041.62005: variable 'controller_profile' from source: play vars 9396 1727204041.62015: variable 'port2_profile' from source: play vars 9396 1727204041.62064: variable 'port2_profile' from source: play vars 9396 1727204041.62071: variable 'dhcp_interface2' from source: play vars 9396 1727204041.62124: variable 'dhcp_interface2' from source: play vars 9396 1727204041.62132: variable 'controller_profile' from source: play vars 9396 1727204041.62180: variable 'controller_profile' from source: play vars 9396 1727204041.62188: variable 'ansible_distribution' from source: facts 9396 1727204041.62194: variable '__network_rh_distros' from source: role '' defaults 9396 1727204041.62201: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.62230: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 9396 1727204041.62373: variable 'ansible_distribution' from source: facts 9396 1727204041.62377: variable '__network_rh_distros' from source: role '' defaults 9396 1727204041.62383: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.62391: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 9396 1727204041.62618: variable 
'ansible_distribution' from source: facts 9396 1727204041.62622: variable '__network_rh_distros' from source: role '' defaults 9396 1727204041.62624: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.62655: variable 'network_provider' from source: set_fact 9396 1727204041.62695: variable 'ansible_facts' from source: unknown 9396 1727204041.63864: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 9396 1727204041.63868: when evaluation is False, skipping this task 9396 1727204041.63872: _execute() done 9396 1727204041.63875: dumping result to json 9396 1727204041.63945: done dumping result, returning 9396 1727204041.63949: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-36c5-1f9e-00000000002e] 9396 1727204041.63951: sending task result for task 12b410aa-8751-36c5-1f9e-00000000002e 9396 1727204041.64023: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000002e 9396 1727204041.64026: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 9396 1727204041.64098: no more pending results, returning what we have 9396 1727204041.64102: results queue empty 9396 1727204041.64103: checking for any_errors_fatal 9396 1727204041.64109: done checking for any_errors_fatal 9396 1727204041.64110: checking for max_fail_percentage 9396 1727204041.64112: done checking for max_fail_percentage 9396 1727204041.64113: checking to see if all hosts have failed and the running result is not ok 9396 1727204041.64114: done checking to see if all hosts have failed 9396 1727204041.64115: getting the remaining hosts for this loop 9396 1727204041.64117: done getting the remaining hosts for this loop 9396 1727204041.64121: getting the next task for host managed-node1 9396 1727204041.64127: done 
getting next task for host managed-node1 9396 1727204041.64132: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 9396 1727204041.64135: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204041.64150: getting variables 9396 1727204041.64157: in VariableManager get_vars() 9396 1727204041.64205: Calling all_inventory to load vars for managed-node1 9396 1727204041.64208: Calling groups_inventory to load vars for managed-node1 9396 1727204041.64211: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204041.64222: Calling all_plugins_play to load vars for managed-node1 9396 1727204041.64225: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204041.64228: Calling groups_plugins_play to load vars for managed-node1 9396 1727204041.70944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204041.73295: done with get_vars() 9396 1727204041.73335: done getting variables 9396 1727204041.73379: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when 
using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.277) 0:00:17.705 ***** 9396 1727204041.73406: entering _queue_task() for managed-node1/package 9396 1727204041.73698: worker is 1 (out of 1 available) 9396 1727204041.73717: exiting _queue_task() for managed-node1/package 9396 1727204041.73733: done queuing things up, now waiting for results queue to drain 9396 1727204041.73735: waiting for pending results... 9396 1727204041.74009: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 9396 1727204041.74297: in run() - task 12b410aa-8751-36c5-1f9e-00000000002f 9396 1727204041.74300: variable 'ansible_search_path' from source: unknown 9396 1727204041.74303: variable 'ansible_search_path' from source: unknown 9396 1727204041.74305: calling self._execute() 9396 1727204041.74311: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204041.74332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204041.74362: variable 'omit' from source: magic vars 9396 1727204041.74873: variable 'ansible_distribution_major_version' from source: facts 9396 1727204041.74900: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204041.75075: variable 'network_state' from source: role '' defaults 9396 1727204041.75115: Evaluated conditional (network_state != {}): False 9396 1727204041.75297: when evaluation is False, skipping this task 9396 1727204041.75303: _execute() done 9396 1727204041.75306: dumping result to json 9396 1727204041.75310: done dumping result, returning 9396 1727204041.75313: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 
[12b410aa-8751-36c5-1f9e-00000000002f] 9396 1727204041.75316: sending task result for task 12b410aa-8751-36c5-1f9e-00000000002f skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 9396 1727204041.75504: no more pending results, returning what we have 9396 1727204041.75511: results queue empty 9396 1727204041.75513: checking for any_errors_fatal 9396 1727204041.75521: done checking for any_errors_fatal 9396 1727204041.75523: checking for max_fail_percentage 9396 1727204041.75524: done checking for max_fail_percentage 9396 1727204041.75526: checking to see if all hosts have failed and the running result is not ok 9396 1727204041.75527: done checking to see if all hosts have failed 9396 1727204041.75528: getting the remaining hosts for this loop 9396 1727204041.75530: done getting the remaining hosts for this loop 9396 1727204041.75535: getting the next task for host managed-node1 9396 1727204041.75545: done getting next task for host managed-node1 9396 1727204041.75549: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 9396 1727204041.75553: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204041.75573: getting variables 9396 1727204041.75575: in VariableManager get_vars() 9396 1727204041.75634: Calling all_inventory to load vars for managed-node1 9396 1727204041.75638: Calling groups_inventory to load vars for managed-node1 9396 1727204041.75641: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204041.75656: Calling all_plugins_play to load vars for managed-node1 9396 1727204041.75660: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204041.75664: Calling groups_plugins_play to load vars for managed-node1 9396 1727204041.76217: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000002f 9396 1727204041.76221: WORKER PROCESS EXITING 9396 1727204041.78921: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204041.84621: done with get_vars() 9396 1727204041.84673: done getting variables 9396 1727204041.84752: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.113) 0:00:17.818 ***** 9396 1727204041.84801: entering _queue_task() for managed-node1/package 9396 1727204041.85165: worker is 1 (out of 1 available) 9396 1727204041.85182: exiting _queue_task() for managed-node1/package 9396 1727204041.85205: done queuing things up, now waiting for results queue to drain 9396 1727204041.85207: waiting for pending results... 
9396 1727204041.85712: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
9396 1727204041.85718: in run() - task 12b410aa-8751-36c5-1f9e-000000000030
9396 1727204041.85723: variable 'ansible_search_path' from source: unknown
9396 1727204041.85726: variable 'ansible_search_path' from source: unknown
9396 1727204041.85810: calling self._execute()
9396 1727204041.86020: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204041.86028: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204041.86040: variable 'omit' from source: magic vars
9396 1727204041.86655: variable 'ansible_distribution_major_version' from source: facts
9396 1727204041.86686: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204041.86865: variable 'network_state' from source: role '' defaults
9396 1727204041.86881: Evaluated conditional (network_state != {}): False
9396 1727204041.86884: when evaluation is False, skipping this task
9396 1727204041.86888: _execute() done
9396 1727204041.86895: dumping result to json
9396 1727204041.86901: done dumping result, returning
9396 1727204041.86915: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-36c5-1f9e-000000000030]
9396 1727204041.86919: sending task result for task 12b410aa-8751-36c5-1f9e-000000000030
9396 1727204041.87029: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000030
9396 1727204041.87034: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
9396 1727204041.87094: no more pending results, returning what we have
9396 1727204041.87099: results queue empty
9396 1727204041.87101: checking for any_errors_fatal
9396 1727204041.87109: done checking for any_errors_fatal
9396 1727204041.87110: checking for max_fail_percentage
9396 1727204041.87112: done checking for max_fail_percentage
9396 1727204041.87113: checking to see if all hosts have failed and the running result is not ok
9396 1727204041.87114: done checking to see if all hosts have failed
9396 1727204041.87115: getting the remaining hosts for this loop
9396 1727204041.87117: done getting the remaining hosts for this loop
9396 1727204041.87121: getting the next task for host managed-node1
9396 1727204041.87129: done getting next task for host managed-node1
9396 1727204041.87134: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
9396 1727204041.87138: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204041.87157: getting variables
9396 1727204041.87159: in VariableManager get_vars()
9396 1727204041.87208: Calling all_inventory to load vars for managed-node1
9396 1727204041.87211: Calling groups_inventory to load vars for managed-node1
9396 1727204041.87214: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204041.87224: Calling all_plugins_play to load vars for managed-node1
9396 1727204041.87227: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204041.87231: Calling groups_plugins_play to load vars for managed-node1
9396 1727204041.88455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204041.91318: done with get_vars()
9396 1727204041.91349: done getting variables
9396 1727204041.91444: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 14:54:01 -0400 (0:00:00.066) 0:00:17.885 *****
9396 1727204041.91473: entering _queue_task() for managed-node1/service
9396 1727204041.91475: Creating lock for service
9396 1727204041.91762: worker is 1 (out of 1 available)
9396 1727204041.91778: exiting _queue_task() for managed-node1/service
9396 1727204041.91794: done queuing things up, now waiting for results queue to drain
9396 1727204041.91796: waiting for pending results...
9396 1727204041.91996: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
9396 1727204041.92109: in run() - task 12b410aa-8751-36c5-1f9e-000000000031
9396 1727204041.92128: variable 'ansible_search_path' from source: unknown
9396 1727204041.92134: variable 'ansible_search_path' from source: unknown
9396 1727204041.92160: calling self._execute()
9396 1727204041.92240: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204041.92247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204041.92258: variable 'omit' from source: magic vars
9396 1727204041.92635: variable 'ansible_distribution_major_version' from source: facts
9396 1727204041.92642: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204041.92938: variable '__network_wireless_connections_defined' from source: role '' defaults
9396 1727204041.93686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
9396 1727204041.97602: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
9396 1727204041.97704: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
9396 1727204041.97779: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
9396 1727204041.97840: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
9396 1727204041.97884: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
9396 1727204041.97988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204041.98036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204041.98069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204041.98136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204041.98159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204041.98235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204041.98273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204041.98314: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204041.98373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204041.98458: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204041.98462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204041.98497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204041.98535: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204041.98598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204041.98623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204041.98911: variable 'network_connections' from source: task vars
9396 1727204041.98914: variable 'controller_profile' from source: play vars
9396 1727204041.98990: variable 'controller_profile' from source: play vars
9396 1727204041.99006: variable 'controller_device' from source: play vars
9396 1727204041.99097: variable 'controller_device' from source: play vars
9396 1727204041.99123: variable 'port1_profile' from source: play vars
9396 1727204041.99216: variable 'port1_profile' from source: play vars
9396 1727204041.99232: variable 'dhcp_interface1' from source: play vars
9396 1727204041.99322: variable 'dhcp_interface1' from source: play vars
9396 1727204041.99340: variable 'controller_profile' from source: play vars
9396 1727204041.99439: variable 'controller_profile' from source: play vars
9396 1727204041.99443: variable 'port2_profile' from source: play vars
9396 1727204041.99547: variable 'port2_profile' from source: play vars
9396 1727204041.99551: variable 'dhcp_interface2' from source: play vars
9396 1727204041.99630: variable 'dhcp_interface2' from source: play vars
9396 1727204041.99644: variable 'controller_profile' from source: play vars
9396 1727204041.99766: variable 'controller_profile' from source: play vars
9396 1727204041.99847: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9396 1727204042.00109: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9396 1727204042.00179: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9396 1727204042.00237: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9396 1727204042.00316: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9396 1727204042.00352: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
9396 1727204042.00386: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
9396 1727204042.00434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204042.00475: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
9396 1727204042.00658: variable '__network_team_connections_defined' from source: role '' defaults
9396 1727204042.00962: variable 'network_connections' from source: task vars
9396 1727204042.00975: variable 'controller_profile' from source: play vars
9396 1727204042.01070: variable 'controller_profile' from source: play vars
9396 1727204042.01084: variable 'controller_device' from source: play vars
9396 1727204042.01176: variable 'controller_device' from source: play vars
9396 1727204042.01197: variable 'port1_profile' from source: play vars
9396 1727204042.01285: variable 'port1_profile' from source: play vars
9396 1727204042.01317: variable 'dhcp_interface1' from source: play vars
9396 1727204042.01393: variable 'dhcp_interface1' from source: play vars
9396 1727204042.01425: variable 'controller_profile' from source: play vars
9396 1727204042.01499: variable 'controller_profile' from source: play vars
9396 1727204042.01534: variable 'port2_profile' from source: play vars
9396 1727204042.01604: variable 'port2_profile' from source: play vars
9396 1727204042.01621: variable 'dhcp_interface2' from source: play vars
9396 1727204042.01762: variable 'dhcp_interface2' from source: play vars
9396 1727204042.01765: variable 'controller_profile' from source: play vars
9396 1727204042.01822: variable 'controller_profile' from source: play vars
9396 1727204042.01872: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
9396 1727204042.01885: when evaluation is False, skipping this task
9396 1727204042.01894: _execute() done
9396 1727204042.01903: dumping result to json
9396 1727204042.02094: done dumping result, returning
9396 1727204042.02098: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-36c5-1f9e-000000000031]
9396 1727204042.02101: sending task result for task 12b410aa-8751-36c5-1f9e-000000000031
9396 1727204042.02177: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000031
9396 1727204042.02181: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
9396 1727204042.02244: no more pending results, returning what we have
9396 1727204042.02249: results queue empty
9396 1727204042.02250: checking for any_errors_fatal
9396 1727204042.02258: done checking for any_errors_fatal
9396 1727204042.02259: checking for max_fail_percentage
9396 1727204042.02261: done checking for max_fail_percentage
9396 1727204042.02262: checking to see if all hosts have failed and the running result is not ok
9396 1727204042.02263: done checking to see if all hosts have failed
9396 1727204042.02264: getting the remaining hosts for this loop
9396 1727204042.02266: done getting the remaining hosts for this loop
9396 1727204042.02271: getting the next task for host managed-node1
9396 1727204042.02279: done getting next task for host managed-node1
9396 1727204042.02285: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
9396 1727204042.02288: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
9396 1727204042.02313: getting variables
9396 1727204042.02315: in VariableManager get_vars()
9396 1727204042.02365: Calling all_inventory to load vars for managed-node1
9396 1727204042.02369: Calling groups_inventory to load vars for managed-node1
9396 1727204042.02372: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204042.02384: Calling all_plugins_play to load vars for managed-node1
9396 1727204042.02388: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204042.02522: Calling groups_plugins_play to load vars for managed-node1
9396 1727204042.05277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204042.08474: done with get_vars()
9396 1727204042.08527: done getting variables
9396 1727204042.08610: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 14:54:02 -0400 (0:00:00.171) 0:00:18.057 *****
9396 1727204042.08654: entering _queue_task() for managed-node1/service
9396 1727204042.09348: worker is 1 (out of 1 available)
9396 1727204042.09363: exiting _queue_task() for managed-node1/service
9396 1727204042.09377: done queuing things up, now waiting for results queue to drain
9396 1727204042.09379: waiting for pending results...
9396 1727204042.09924: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
9396 1727204042.09971: in run() - task 12b410aa-8751-36c5-1f9e-000000000032
9396 1727204042.09998: variable 'ansible_search_path' from source: unknown
9396 1727204042.10018: variable 'ansible_search_path' from source: unknown
9396 1727204042.10065: calling self._execute()
9396 1727204042.10181: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204042.10197: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204042.10233: variable 'omit' from source: magic vars
9396 1727204042.10772: variable 'ansible_distribution_major_version' from source: facts
9396 1727204042.10825: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204042.11122: variable 'network_provider' from source: set_fact
9396 1727204042.11135: variable 'network_state' from source: role '' defaults
9396 1727204042.11158: Evaluated conditional (network_provider == "nm" or network_state != {}): True
9396 1727204042.11172: variable 'omit' from source: magic vars
9396 1727204042.11291: variable 'omit' from source: magic vars
9396 1727204042.11373: variable 'network_service_name' from source: role '' defaults
9396 1727204042.11439: variable 'network_service_name' from source: role '' defaults
9396 1727204042.11593: variable '__network_provider_setup' from source: role '' defaults
9396 1727204042.11630: variable '__network_service_name_default_nm' from source: role '' defaults
9396 1727204042.11753: variable '__network_service_name_default_nm' from source: role '' defaults
9396 1727204042.11756: variable '__network_packages_default_nm' from source: role '' defaults
9396 1727204042.11833: variable '__network_packages_default_nm' from source: role '' defaults
9396 1727204042.12179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
9396 1727204042.18203: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
9396 1727204042.18528: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
9396 1727204042.18564: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
9396 1727204042.18695: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
9396 1727204042.18779: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
9396 1727204042.19002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204042.19078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204042.19185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204042.19249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204042.19697: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204042.19703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204042.19706: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204042.19742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204042.19848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204042.19920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204042.20749: variable '__network_packages_default_gobject_packages' from source: role '' defaults 9396 1727204042.21056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204042.21170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204042.21232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204042.21435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204042.21439: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204042.21688: variable 'ansible_python' from source: facts 9396 1727204042.21830: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 9396 1727204042.22083: variable '__network_wpa_supplicant_required' from source: role '' defaults 9396 1727204042.22344: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 9396 1727204042.22786: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204042.22937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204042.23056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204042.23168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204042.23256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204042.23399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204042.23521: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204042.23587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204042.23903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204042.23909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204042.24256: variable 'network_connections' from source: task vars 9396 1727204042.24349: variable 'controller_profile' from source: play vars 9396 1727204042.24563: variable 'controller_profile' from source: play vars 9396 1727204042.24683: variable 'controller_device' from source: play vars 9396 1727204042.24789: variable 'controller_device' from source: play vars 9396 1727204042.24997: variable 'port1_profile' from source: play vars 9396 1727204042.25220: variable 'port1_profile' from source: play vars 9396 1727204042.25239: variable 'dhcp_interface1' from source: play vars 9396 1727204042.25465: variable 'dhcp_interface1' from source: play vars 9396 1727204042.25468: variable 'controller_profile' from source: play vars 9396 1727204042.25579: variable 'controller_profile' from source: play vars 9396 1727204042.25603: variable 'port2_profile' from source: play vars 9396 1727204042.25792: variable 'port2_profile' from source: play vars 9396 1727204042.25815: variable 'dhcp_interface2' from source: play vars 9396 1727204042.25918: variable 'dhcp_interface2' from source: play vars 9396 1727204042.25932: variable 
'controller_profile' from source: play vars 9396 1727204042.26021: variable 'controller_profile' from source: play vars 9396 1727204042.26302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204042.27098: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204042.27102: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204042.27295: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204042.27696: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204042.27700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204042.27862: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204042.27945: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204042.28052: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204042.28342: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204042.29274: variable 'network_connections' from source: task vars 9396 1727204042.29431: variable 'controller_profile' from source: play vars 9396 1727204042.29741: variable 'controller_profile' from source: play vars 9396 1727204042.29762: variable 'controller_device' from 
source: play vars 9396 1727204042.29879: variable 'controller_device' from source: play vars 9396 1727204042.29910: variable 'port1_profile' from source: play vars 9396 1727204042.30012: variable 'port1_profile' from source: play vars 9396 1727204042.30034: variable 'dhcp_interface1' from source: play vars 9396 1727204042.30258: variable 'dhcp_interface1' from source: play vars 9396 1727204042.30262: variable 'controller_profile' from source: play vars 9396 1727204042.30360: variable 'controller_profile' from source: play vars 9396 1727204042.30412: variable 'port2_profile' from source: play vars 9396 1727204042.30567: variable 'port2_profile' from source: play vars 9396 1727204042.30571: variable 'dhcp_interface2' from source: play vars 9396 1727204042.30640: variable 'dhcp_interface2' from source: play vars 9396 1727204042.30684: variable 'controller_profile' from source: play vars 9396 1727204042.30999: variable 'controller_profile' from source: play vars 9396 1727204042.31232: variable '__network_packages_default_wireless' from source: role '' defaults 9396 1727204042.31392: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204042.32198: variable 'network_connections' from source: task vars 9396 1727204042.32202: variable 'controller_profile' from source: play vars 9396 1727204042.32238: variable 'controller_profile' from source: play vars 9396 1727204042.32254: variable 'controller_device' from source: play vars 9396 1727204042.32497: variable 'controller_device' from source: play vars 9396 1727204042.32506: variable 'port1_profile' from source: play vars 9396 1727204042.32601: variable 'port1_profile' from source: play vars 9396 1727204042.32618: variable 'dhcp_interface1' from source: play vars 9396 1727204042.32706: variable 'dhcp_interface1' from source: play vars 9396 1727204042.32756: variable 'controller_profile' from source: play vars 9396 1727204042.32915: variable 'controller_profile' from source: play vars 9396 
1727204042.32977: variable 'port2_profile' from source: play vars 9396 1727204042.33231: variable 'port2_profile' from source: play vars 9396 1727204042.33237: variable 'dhcp_interface2' from source: play vars 9396 1727204042.33449: variable 'dhcp_interface2' from source: play vars 9396 1727204042.33453: variable 'controller_profile' from source: play vars 9396 1727204042.33605: variable 'controller_profile' from source: play vars 9396 1727204042.33684: variable '__network_packages_default_team' from source: role '' defaults 9396 1727204042.33873: variable '__network_team_connections_defined' from source: role '' defaults 9396 1727204042.35085: variable 'network_connections' from source: task vars 9396 1727204042.35091: variable 'controller_profile' from source: play vars 9396 1727204042.35220: variable 'controller_profile' from source: play vars 9396 1727204042.35240: variable 'controller_device' from source: play vars 9396 1727204042.35374: variable 'controller_device' from source: play vars 9396 1727204042.35461: variable 'port1_profile' from source: play vars 9396 1727204042.35715: variable 'port1_profile' from source: play vars 9396 1727204042.35775: variable 'dhcp_interface1' from source: play vars 9396 1727204042.35969: variable 'dhcp_interface1' from source: play vars 9396 1727204042.35972: variable 'controller_profile' from source: play vars 9396 1727204042.36292: variable 'controller_profile' from source: play vars 9396 1727204042.36295: variable 'port2_profile' from source: play vars 9396 1727204042.36353: variable 'port2_profile' from source: play vars 9396 1727204042.36414: variable 'dhcp_interface2' from source: play vars 9396 1727204042.36619: variable 'dhcp_interface2' from source: play vars 9396 1727204042.36634: variable 'controller_profile' from source: play vars 9396 1727204042.36904: variable 'controller_profile' from source: play vars 9396 1727204042.37223: variable '__network_service_name_default_initscripts' from source: role '' defaults 
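The long runs of "variable '…' from source: play vars / role '' defaults / task vars / set_fact" records above show Ansible reporting, for each variable, the highest-precedence source that defines it. A minimal sketch of that behaviour (not Ansible's actual implementation, and only a small, correctly ordered subset of its full precedence list; the example values are hypothetical) looks like:

```python
# Sketch of precedence resolution as reported in the trace: sources are
# applied low-to-high, so a later (higher-precedence) source shadows an
# earlier one, and each variable remembers its winning source.
PRECEDENCE = [
    "role '' defaults",   # lowest of this subset
    "play vars",
    "task vars",
    "set_fact",           # highest of this subset
]

def resolve(sources):
    """Map each variable name to (value, winning source).

    `sources` maps a source name (as printed in the trace) to the dict
    of variables that source defines.
    """
    resolved = {}
    for source in PRECEDENCE:                      # low to high: later wins
        for name, value in sources.get(source, {}).items():
            resolved[name] = (value, source)
    return resolved

# Mirrors the trace, where 'network_provider' is reported "from source:
# set_fact" even though a role default could also define it (values here
# are illustrative, not taken from the run).
winners = resolve({
    "role '' defaults": {"network_provider": "initscripts"},
    "play vars": {"controller_profile": "bond0"},
    "set_fact": {"network_provider": "nm"},
})
print(winners["network_provider"])  # ('nm', 'set_fact')
```

This is why the trace can print the same variable name many times: each evaluation re-reports the winning source for that lookup.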
9396 1727204042.37326: variable '__network_service_name_default_initscripts' from source: role '' defaults 9396 1727204042.37375: variable '__network_packages_default_initscripts' from source: role '' defaults 9396 1727204042.37631: variable '__network_packages_default_initscripts' from source: role '' defaults 9396 1727204042.38327: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 9396 1727204042.39653: variable 'network_connections' from source: task vars 9396 1727204042.39668: variable 'controller_profile' from source: play vars 9396 1727204042.39763: variable 'controller_profile' from source: play vars 9396 1727204042.39777: variable 'controller_device' from source: play vars 9396 1727204042.39869: variable 'controller_device' from source: play vars 9396 1727204042.39886: variable 'port1_profile' from source: play vars 9396 1727204042.39975: variable 'port1_profile' from source: play vars 9396 1727204042.39992: variable 'dhcp_interface1' from source: play vars 9396 1727204042.40114: variable 'dhcp_interface1' from source: play vars 9396 1727204042.40128: variable 'controller_profile' from source: play vars 9396 1727204042.40217: variable 'controller_profile' from source: play vars 9396 1727204042.40232: variable 'port2_profile' from source: play vars 9396 1727204042.40322: variable 'port2_profile' from source: play vars 9396 1727204042.40344: variable 'dhcp_interface2' from source: play vars 9396 1727204042.40429: variable 'dhcp_interface2' from source: play vars 9396 1727204042.40450: variable 'controller_profile' from source: play vars 9396 1727204042.40543: variable 'controller_profile' from source: play vars 9396 1727204042.40559: variable 'ansible_distribution' from source: facts 9396 1727204042.40569: variable '__network_rh_distros' from source: role '' defaults 9396 1727204042.40593: variable 'ansible_distribution_major_version' from source: facts 9396 1727204042.40632: variable 
'__network_packages_default_initscripts_network_scripts' from source: role '' defaults 9396 1727204042.40947: variable 'ansible_distribution' from source: facts 9396 1727204042.40950: variable '__network_rh_distros' from source: role '' defaults 9396 1727204042.40953: variable 'ansible_distribution_major_version' from source: facts 9396 1727204042.40955: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 9396 1727204042.41213: variable 'ansible_distribution' from source: facts 9396 1727204042.41223: variable '__network_rh_distros' from source: role '' defaults 9396 1727204042.41233: variable 'ansible_distribution_major_version' from source: facts 9396 1727204042.41284: variable 'network_provider' from source: set_fact 9396 1727204042.41326: variable 'omit' from source: magic vars 9396 1727204042.41698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204042.41702: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204042.41705: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204042.41709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204042.41712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204042.41744: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204042.41753: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204042.41762: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204042.41958: Set connection var ansible_timeout to 10 9396 1727204042.41972: Set connection var ansible_shell_executable to /bin/sh 9396 1727204042.41987: Set connection var ansible_pipelining 
to False 9396 1727204042.42001: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204042.42015: Set connection var ansible_connection to ssh 9396 1727204042.42030: Set connection var ansible_shell_type to sh 9396 1727204042.42067: variable 'ansible_shell_executable' from source: unknown 9396 1727204042.42075: variable 'ansible_connection' from source: unknown 9396 1727204042.42083: variable 'ansible_module_compression' from source: unknown 9396 1727204042.42093: variable 'ansible_shell_type' from source: unknown 9396 1727204042.42100: variable 'ansible_shell_executable' from source: unknown 9396 1727204042.42109: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204042.42119: variable 'ansible_pipelining' from source: unknown 9396 1727204042.42131: variable 'ansible_timeout' from source: unknown 9396 1727204042.42144: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204042.42294: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204042.42356: variable 'omit' from source: magic vars 9396 1727204042.42373: starting attempt loop 9396 1727204042.42459: running the handler 9396 1727204042.42499: variable 'ansible_facts' from source: unknown 9396 1727204042.44681: _low_level_execute_command(): starting 9396 1727204042.44702: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204042.45530: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 
10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204042.45550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204042.45636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204042.45677: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204042.45696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204042.45753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204042.45811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204042.47846: stdout chunk (state=3): >>>/root <<< 9396 1727204042.48096: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204042.48100: stdout chunk (state=3): >>><<< 9396 1727204042.48103: stderr chunk (state=3): >>><<< 9396 1727204042.48109: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204042.48118: _low_level_execute_command(): starting 9396 1727204042.48121: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891 `" && echo ansible-tmp-1727204042.4799194-10997-28914524822891="` echo /root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891 `" ) && sleep 0' 9396 1727204042.48930: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204042.48936: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204042.48955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204042.49035: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204042.51330: stdout chunk (state=3): >>>ansible-tmp-1727204042.4799194-10997-28914524822891=/root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891 <<< 9396 1727204042.51514: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204042.51518: stdout chunk (state=3): >>><<< 9396 1727204042.51520: stderr chunk (state=3): >>><<< 9396 1727204042.51537: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204042.4799194-10997-28914524822891=/root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204042.51639: variable 'ansible_module_compression' from source: unknown 9396 1727204042.51897: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 9396 1727204042.51901: ANSIBALLZ: Acquiring lock 9396 1727204042.51904: ANSIBALLZ: Lock acquired: 139797141880704 9396 1727204042.51906: ANSIBALLZ: Creating module 9396 1727204042.95443: ANSIBALLZ: Writing module into payload 9396 1727204042.95660: ANSIBALLZ: Writing module 9396 1727204042.95702: ANSIBALLZ: Renaming module 9396 1727204042.95711: ANSIBALLZ: Done creating module 9396 1727204042.95751: variable 'ansible_facts' from source: unknown 9396 1727204042.95965: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/AnsiballZ_systemd.py 9396 1727204042.96117: Sending initial data 9396 1727204042.96120: Sent initial data (154 bytes) 9396 1727204042.96975: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204042.97022: stderr chunk (state=3): >>>debug1: auto-mux: 
Trying existing master <<< 9396 1727204042.97086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204042.97094: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204042.97152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204042.98926: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204042.98985: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204042.99039: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmppyhpqvgg /root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/AnsiballZ_systemd.py <<< 9396 1727204042.99044: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/AnsiballZ_systemd.py" <<< 9396 1727204042.99096: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmppyhpqvgg" to remote "/root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/AnsiballZ_systemd.py" <<< 9396 1727204043.01811: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204043.02001: stderr chunk (state=3): >>><<< 9396 1727204043.02005: stdout chunk (state=3): >>><<< 9396 1727204043.02010: done transferring module to remote 9396 1727204043.02013: _low_level_execute_command(): starting 9396 1727204043.02015: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/ /root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/AnsiballZ_systemd.py && sleep 0' 9396 1727204043.02609: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204043.02620: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204043.02628: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204043.02648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204043.02699: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204043.02703: 
stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204043.02706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.02712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204043.02714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204043.02717: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204043.02726: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204043.02737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204043.02767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204043.02770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204043.02773: stderr chunk (state=3): >>>debug2: match found <<< 9396 1727204043.02775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.02873: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204043.02877: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204043.02900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204043.02959: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204043.05049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204043.05054: stdout chunk (state=3): >>><<< 9396 1727204043.05060: stderr chunk (state=3): >>><<< 9396 1727204043.05233: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204043.05237: _low_level_execute_command(): starting 9396 1727204043.05240: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/AnsiballZ_systemd.py && sleep 0' 9396 1727204043.05819: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204043.05842: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204043.05858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204043.05880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204043.05904: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204043.05919: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204043.05934: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.05955: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204043.05971: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204043.05984: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204043.06009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204043.06026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204043.06045: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204043.06105: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.06147: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204043.06174: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204043.06412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204043.06550: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204043.40075: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", 
"RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11436032", "MemoryAvailable": "infinity", "CPUUsageNSec": "437959000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", 
"IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "inf<<< 9396 1727204043.40105: stdout chunk (state=3): >>>inity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", 
"LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", 
"MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "lo<<< 9396 1727204043.40119: stdout chunk (state=3): >>>aded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", 
"UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 9396 1727204043.42203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204043.42257: stderr chunk (state=3): >>><<< 9396 1727204043.42264: stdout chunk (state=3): >>><<< 9396 1727204043.42278: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; 
pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11436032", "MemoryAvailable": "infinity", "CPUUsageNSec": "437959000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", 
"UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", 
"PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service 
NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": 
"started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
9396 1727204043.42466: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204043.42484: _low_level_execute_command(): starting 9396 1727204043.42491: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204042.4799194-10997-28914524822891/ > /dev/null 2>&1 && sleep 0' 9396 1727204043.42970: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204043.42974: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.42976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204043.42979: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204043.42981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.43039: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204043.43045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204043.43094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204043.45037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204043.45101: stderr chunk (state=3): >>><<< 9396 1727204043.45105: stdout chunk (state=3): >>><<< 9396 1727204043.45123: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 9396 1727204043.45131: handler run complete 9396 1727204043.45187: attempt loop complete, returning result 9396 1727204043.45192: _execute() done 9396 1727204043.45195: dumping result to json 9396 1727204043.45212: done dumping result, returning 9396 1727204043.45221: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-36c5-1f9e-000000000032] 9396 1727204043.45226: sending task result for task 12b410aa-8751-36c5-1f9e-000000000032 9396 1727204043.45504: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000032 9396 1727204043.45509: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204043.45570: no more pending results, returning what we have 9396 1727204043.45574: results queue empty 9396 1727204043.45575: checking for any_errors_fatal 9396 1727204043.45582: done checking for any_errors_fatal 9396 1727204043.45583: checking for max_fail_percentage 9396 1727204043.45584: done checking for max_fail_percentage 9396 1727204043.45585: checking to see if all hosts have failed and the running result is not ok 9396 1727204043.45586: done checking to see if all hosts have failed 9396 1727204043.45587: getting the remaining hosts for this loop 9396 1727204043.45591: done getting the remaining hosts for this loop 9396 1727204043.45595: getting the next task for host managed-node1 9396 1727204043.45601: done getting next task for host managed-node1 9396 1727204043.45605: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 9396 1727204043.45611: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204043.45631: getting variables 9396 1727204043.45633: in VariableManager get_vars() 9396 1727204043.45672: Calling all_inventory to load vars for managed-node1 9396 1727204043.45675: Calling groups_inventory to load vars for managed-node1 9396 1727204043.45678: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204043.45691: Calling all_plugins_play to load vars for managed-node1 9396 1727204043.45695: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204043.45699: Calling groups_plugins_play to load vars for managed-node1 9396 1727204043.47027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204043.48606: done with get_vars() 9396 1727204043.48635: done getting variables 9396 1727204043.48692: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:03 -0400 (0:00:01.400) 0:00:19.458 ***** 9396 1727204043.48722: entering _queue_task() for managed-node1/service 9396 1727204043.49001: worker is 1 (out of 1 available) 9396 1727204043.49016: exiting _queue_task() for managed-node1/service 9396 
1727204043.49031: done queuing things up, now waiting for results queue to drain 9396 1727204043.49033: waiting for pending results... 9396 1727204043.49235: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 9396 1727204043.49336: in run() - task 12b410aa-8751-36c5-1f9e-000000000033 9396 1727204043.49350: variable 'ansible_search_path' from source: unknown 9396 1727204043.49353: variable 'ansible_search_path' from source: unknown 9396 1727204043.49392: calling self._execute() 9396 1727204043.49470: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204043.49475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204043.49490: variable 'omit' from source: magic vars 9396 1727204043.49821: variable 'ansible_distribution_major_version' from source: facts 9396 1727204043.49829: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204043.49932: variable 'network_provider' from source: set_fact 9396 1727204043.49938: Evaluated conditional (network_provider == "nm"): True 9396 1727204043.50019: variable '__network_wpa_supplicant_required' from source: role '' defaults 9396 1727204043.50096: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 9396 1727204043.50254: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204043.51971: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204043.52033: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204043.52065: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204043.52099: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 
1727204043.52126: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204043.52210: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204043.52240: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204043.52261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204043.52296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204043.52312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204043.52357: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204043.52377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204043.52399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204043.52436: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204043.52450: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204043.52485: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204043.52508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204043.52531: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204043.52565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204043.52578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204043.52702: variable 'network_connections' from source: task vars 9396 1727204043.52718: variable 'controller_profile' from source: play vars 9396 1727204043.52781: variable 'controller_profile' from source: play vars 9396 1727204043.52792: variable 'controller_device' from source: play vars 9396 1727204043.52846: variable 'controller_device' from source: play vars 9396 1727204043.52855: variable 'port1_profile' from source: play vars 9396 1727204043.52910: 
variable 'port1_profile' from source: play vars 9396 1727204043.52921: variable 'dhcp_interface1' from source: play vars 9396 1727204043.52972: variable 'dhcp_interface1' from source: play vars 9396 1727204043.52979: variable 'controller_profile' from source: play vars 9396 1727204043.53032: variable 'controller_profile' from source: play vars 9396 1727204043.53040: variable 'port2_profile' from source: play vars 9396 1727204043.53090: variable 'port2_profile' from source: play vars 9396 1727204043.53101: variable 'dhcp_interface2' from source: play vars 9396 1727204043.53154: variable 'dhcp_interface2' from source: play vars 9396 1727204043.53161: variable 'controller_profile' from source: play vars 9396 1727204043.53216: variable 'controller_profile' from source: play vars 9396 1727204043.53280: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204043.53425: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204043.53458: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204043.53486: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204043.53518: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204043.53557: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204043.53575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204043.53597: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204043.53628: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204043.53674: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204043.53895: variable 'network_connections' from source: task vars 9396 1727204043.53901: variable 'controller_profile' from source: play vars 9396 1727204043.53956: variable 'controller_profile' from source: play vars 9396 1727204043.53964: variable 'controller_device' from source: play vars 9396 1727204043.54019: variable 'controller_device' from source: play vars 9396 1727204043.54027: variable 'port1_profile' from source: play vars 9396 1727204043.54080: variable 'port1_profile' from source: play vars 9396 1727204043.54088: variable 'dhcp_interface1' from source: play vars 9396 1727204043.54142: variable 'dhcp_interface1' from source: play vars 9396 1727204043.54149: variable 'controller_profile' from source: play vars 9396 1727204043.54204: variable 'controller_profile' from source: play vars 9396 1727204043.54213: variable 'port2_profile' from source: play vars 9396 1727204043.54263: variable 'port2_profile' from source: play vars 9396 1727204043.54272: variable 'dhcp_interface2' from source: play vars 9396 1727204043.54326: variable 'dhcp_interface2' from source: play vars 9396 1727204043.54333: variable 'controller_profile' from source: play vars 9396 1727204043.54385: variable 'controller_profile' from source: play vars 9396 1727204043.54427: Evaluated conditional (__network_wpa_supplicant_required): False 9396 1727204043.54431: when evaluation is False, skipping this task 9396 1727204043.54434: _execute() done 9396 1727204043.54436: dumping result to json 9396 1727204043.54441: done dumping result, returning 9396 
1727204043.54450: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-36c5-1f9e-000000000033] 9396 1727204043.54455: sending task result for task 12b410aa-8751-36c5-1f9e-000000000033 9396 1727204043.54553: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000033 9396 1727204043.54556: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 9396 1727204043.54614: no more pending results, returning what we have 9396 1727204043.54618: results queue empty 9396 1727204043.54619: checking for any_errors_fatal 9396 1727204043.54649: done checking for any_errors_fatal 9396 1727204043.54650: checking for max_fail_percentage 9396 1727204043.54652: done checking for max_fail_percentage 9396 1727204043.54653: checking to see if all hosts have failed and the running result is not ok 9396 1727204043.54654: done checking to see if all hosts have failed 9396 1727204043.54654: getting the remaining hosts for this loop 9396 1727204043.54656: done getting the remaining hosts for this loop 9396 1727204043.54661: getting the next task for host managed-node1 9396 1727204043.54677: done getting next task for host managed-node1 9396 1727204043.54681: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 9396 1727204043.54684: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 9396 1727204043.54704: getting variables 9396 1727204043.54705: in VariableManager get_vars() 9396 1727204043.54748: Calling all_inventory to load vars for managed-node1 9396 1727204043.54751: Calling groups_inventory to load vars for managed-node1 9396 1727204043.54753: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204043.54764: Calling all_plugins_play to load vars for managed-node1 9396 1727204043.54767: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204043.54770: Calling groups_plugins_play to load vars for managed-node1 9396 1727204043.55981: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204043.57543: done with get_vars() 9396 1727204043.57572: done getting variables 9396 1727204043.57630: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:54:03 -0400 (0:00:00.089) 0:00:19.547 ***** 9396 1727204043.57658: entering _queue_task() for managed-node1/service 9396 1727204043.57931: worker is 1 (out of 1 available) 9396 1727204043.57947: exiting _queue_task() for managed-node1/service 9396 1727204043.57961: done queuing things up, now waiting for results queue to drain 9396 1727204043.57964: waiting for pending results... 
9396 1727204043.58154: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service 9396 1727204043.58260: in run() - task 12b410aa-8751-36c5-1f9e-000000000034 9396 1727204043.58273: variable 'ansible_search_path' from source: unknown 9396 1727204043.58277: variable 'ansible_search_path' from source: unknown 9396 1727204043.58317: calling self._execute() 9396 1727204043.58396: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204043.58406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204043.58420: variable 'omit' from source: magic vars 9396 1727204043.58749: variable 'ansible_distribution_major_version' from source: facts 9396 1727204043.58757: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204043.58861: variable 'network_provider' from source: set_fact 9396 1727204043.58866: Evaluated conditional (network_provider == "initscripts"): False 9396 1727204043.58871: when evaluation is False, skipping this task 9396 1727204043.58876: _execute() done 9396 1727204043.58881: dumping result to json 9396 1727204043.58886: done dumping result, returning 9396 1727204043.58896: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-36c5-1f9e-000000000034] 9396 1727204043.58903: sending task result for task 12b410aa-8751-36c5-1f9e-000000000034 9396 1727204043.59004: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000034 9396 1727204043.59007: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204043.59056: no more pending results, returning what we have 9396 1727204043.59061: results queue empty 9396 1727204043.59063: checking for any_errors_fatal 9396 1727204043.59074: done checking for any_errors_fatal 9396 
1727204043.59075: checking for max_fail_percentage 9396 1727204043.59076: done checking for max_fail_percentage 9396 1727204043.59078: checking to see if all hosts have failed and the running result is not ok 9396 1727204043.59079: done checking to see if all hosts have failed 9396 1727204043.59080: getting the remaining hosts for this loop 9396 1727204043.59081: done getting the remaining hosts for this loop 9396 1727204043.59086: getting the next task for host managed-node1 9396 1727204043.59096: done getting next task for host managed-node1 9396 1727204043.59100: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 9396 1727204043.59103: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204043.59121: getting variables 9396 1727204043.59123: in VariableManager get_vars() 9396 1727204043.59163: Calling all_inventory to load vars for managed-node1 9396 1727204043.59166: Calling groups_inventory to load vars for managed-node1 9396 1727204043.59169: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204043.59179: Calling all_plugins_play to load vars for managed-node1 9396 1727204043.59182: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204043.59185: Calling groups_plugins_play to load vars for managed-node1 9396 1727204043.60499: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204043.62065: done with get_vars() 9396 1727204043.62096: done getting variables 9396 1727204043.62153: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:54:03 -0400 (0:00:00.045) 0:00:19.592 ***** 9396 1727204043.62182: entering _queue_task() for managed-node1/copy 9396 1727204043.62458: worker is 1 (out of 1 available) 9396 1727204043.62473: exiting _queue_task() for managed-node1/copy 9396 1727204043.62486: done queuing things up, now waiting for results queue to drain 9396 1727204043.62488: waiting for pending results... 
9396 1727204043.62687: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 9396 1727204043.62790: in run() - task 12b410aa-8751-36c5-1f9e-000000000035 9396 1727204043.62805: variable 'ansible_search_path' from source: unknown 9396 1727204043.62808: variable 'ansible_search_path' from source: unknown 9396 1727204043.62847: calling self._execute() 9396 1727204043.62924: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204043.62937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204043.62948: variable 'omit' from source: magic vars 9396 1727204043.63278: variable 'ansible_distribution_major_version' from source: facts 9396 1727204043.63290: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204043.63394: variable 'network_provider' from source: set_fact 9396 1727204043.63398: Evaluated conditional (network_provider == "initscripts"): False 9396 1727204043.63403: when evaluation is False, skipping this task 9396 1727204043.63407: _execute() done 9396 1727204043.63414: dumping result to json 9396 1727204043.63418: done dumping result, returning 9396 1727204043.63428: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-36c5-1f9e-000000000035] 9396 1727204043.63433: sending task result for task 12b410aa-8751-36c5-1f9e-000000000035 9396 1727204043.63536: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000035 9396 1727204043.63539: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 9396 1727204043.63596: no more pending results, returning what we have 9396 1727204043.63601: results queue empty 9396 1727204043.63602: checking for any_errors_fatal 9396 
1727204043.63609: done checking for any_errors_fatal 9396 1727204043.63610: checking for max_fail_percentage 9396 1727204043.63612: done checking for max_fail_percentage 9396 1727204043.63613: checking to see if all hosts have failed and the running result is not ok 9396 1727204043.63614: done checking to see if all hosts have failed 9396 1727204043.63615: getting the remaining hosts for this loop 9396 1727204043.63616: done getting the remaining hosts for this loop 9396 1727204043.63622: getting the next task for host managed-node1 9396 1727204043.63629: done getting next task for host managed-node1 9396 1727204043.63633: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 9396 1727204043.63637: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204043.63655: getting variables 9396 1727204043.63656: in VariableManager get_vars() 9396 1727204043.63699: Calling all_inventory to load vars for managed-node1 9396 1727204043.63701: Calling groups_inventory to load vars for managed-node1 9396 1727204043.63704: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204043.63715: Calling all_plugins_play to load vars for managed-node1 9396 1727204043.63717: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204043.63721: Calling groups_plugins_play to load vars for managed-node1 9396 1727204043.65007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204043.66561: done with get_vars() 9396 1727204043.66591: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:03 -0400 (0:00:00.044) 0:00:19.637 ***** 9396 1727204043.66667: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 9396 1727204043.66669: Creating lock for fedora.linux_system_roles.network_connections 9396 1727204043.66952: worker is 1 (out of 1 available) 9396 1727204043.66966: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 9396 1727204043.66980: done queuing things up, now waiting for results queue to drain 9396 1727204043.66982: waiting for pending results... 
9396 1727204043.67180: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 9396 1727204043.67280: in run() - task 12b410aa-8751-36c5-1f9e-000000000036 9396 1727204043.67294: variable 'ansible_search_path' from source: unknown 9396 1727204043.67298: variable 'ansible_search_path' from source: unknown 9396 1727204043.67335: calling self._execute() 9396 1727204043.67413: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204043.67420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204043.67431: variable 'omit' from source: magic vars 9396 1727204043.67755: variable 'ansible_distribution_major_version' from source: facts 9396 1727204043.67764: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204043.67776: variable 'omit' from source: magic vars 9396 1727204043.67827: variable 'omit' from source: magic vars 9396 1727204043.67979: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204043.69698: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204043.69757: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204043.69791: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204043.69830: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204043.69850: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204043.69923: variable 'network_provider' from source: set_fact 9396 1727204043.70040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204043.70078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204043.70102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204043.70137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204043.70153: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204043.70218: variable 'omit' from source: magic vars 9396 1727204043.70319: variable 'omit' from source: magic vars 9396 1727204043.70407: variable 'network_connections' from source: task vars 9396 1727204043.70421: variable 'controller_profile' from source: play vars 9396 1727204043.70474: variable 'controller_profile' from source: play vars 9396 1727204043.70478: variable 'controller_device' from source: play vars 9396 1727204043.70534: variable 'controller_device' from source: play vars 9396 1727204043.70544: variable 'port1_profile' from source: play vars 9396 1727204043.70599: variable 'port1_profile' from source: play vars 9396 1727204043.70603: variable 'dhcp_interface1' from source: play vars 9396 1727204043.70655: variable 'dhcp_interface1' from source: play vars 9396 1727204043.70662: variable 'controller_profile' from source: play vars 9396 1727204043.70719: variable 'controller_profile' from source: play vars 9396 1727204043.70726: variable 
'port2_profile' from source: play vars 9396 1727204043.70775: variable 'port2_profile' from source: play vars 9396 1727204043.70784: variable 'dhcp_interface2' from source: play vars 9396 1727204043.70839: variable 'dhcp_interface2' from source: play vars 9396 1727204043.70846: variable 'controller_profile' from source: play vars 9396 1727204043.70901: variable 'controller_profile' from source: play vars 9396 1727204043.71059: variable 'omit' from source: magic vars 9396 1727204043.71068: variable '__lsr_ansible_managed' from source: task vars 9396 1727204043.71122: variable '__lsr_ansible_managed' from source: task vars 9396 1727204043.71281: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 9396 1727204043.71470: Loaded config def from plugin (lookup/template) 9396 1727204043.71473: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 9396 1727204043.71498: File lookup term: get_ansible_managed.j2 9396 1727204043.71501: variable 'ansible_search_path' from source: unknown 9396 1727204043.71507: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 9396 1727204043.71523: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 9396 1727204043.71537: variable 'ansible_search_path' from source: unknown 9396 1727204043.76948: variable 'ansible_managed' from source: unknown 9396 1727204043.77091: variable 'omit' from source: magic vars 9396 1727204043.77122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204043.77146: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204043.77163: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204043.77179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204043.77191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204043.77222: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204043.77225: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204043.77230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204043.77307: Set connection var ansible_timeout to 10 9396 1727204043.77317: Set connection var ansible_shell_executable to /bin/sh 9396 1727204043.77329: Set connection var ansible_pipelining to False 9396 1727204043.77336: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204043.77347: Set connection var ansible_connection to ssh 9396 1727204043.77350: Set connection var ansible_shell_type to sh 9396 1727204043.77375: variable 
'ansible_shell_executable' from source: unknown 9396 1727204043.77379: variable 'ansible_connection' from source: unknown 9396 1727204043.77381: variable 'ansible_module_compression' from source: unknown 9396 1727204043.77384: variable 'ansible_shell_type' from source: unknown 9396 1727204043.77390: variable 'ansible_shell_executable' from source: unknown 9396 1727204043.77393: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204043.77398: variable 'ansible_pipelining' from source: unknown 9396 1727204043.77401: variable 'ansible_timeout' from source: unknown 9396 1727204043.77407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204043.77525: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204043.77535: variable 'omit' from source: magic vars 9396 1727204043.77547: starting attempt loop 9396 1727204043.77550: running the handler 9396 1727204043.77565: _low_level_execute_command(): starting 9396 1727204043.77595: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204043.78123: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204043.78127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.78131: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204043.78134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.78185: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204043.78188: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204043.78196: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204043.78250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204043.80029: stdout chunk (state=3): >>>/root <<< 9396 1727204043.80125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204043.80193: stderr chunk (state=3): >>><<< 9396 1727204043.80196: stdout chunk (state=3): >>><<< 9396 1727204043.80222: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204043.80234: _low_level_execute_command(): starting 9396 1727204043.80241: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482 `" && echo ansible-tmp-1727204043.8022313-11208-28218938519482="` echo /root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482 `" ) && sleep 0' 9396 1727204043.80732: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204043.80736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.80740: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204043.80743: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204043.80803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204043.80810: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204043.80812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204043.80851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204043.82894: stdout chunk (state=3): >>>ansible-tmp-1727204043.8022313-11208-28218938519482=/root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482 <<< 9396 1727204043.83007: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204043.83071: stderr chunk (state=3): >>><<< 9396 1727204043.83074: stdout chunk (state=3): >>><<< 9396 1727204043.83094: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204043.8022313-11208-28218938519482=/root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 9396 1727204043.84455: variable 'ansible_module_compression' from source: unknown 9396 1727204043.84459: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 9396 1727204043.84462: ANSIBALLZ: Acquiring lock 9396 1727204043.84464: ANSIBALLZ: Lock acquired: 139797140919728 9396 1727204043.84467: ANSIBALLZ: Creating module 9396 1727204044.04109: ANSIBALLZ: Writing module into payload 9396 1727204044.04448: ANSIBALLZ: Writing module 9396 1727204044.04473: ANSIBALLZ: Renaming module 9396 1727204044.04477: ANSIBALLZ: Done creating module 9396 1727204044.04502: variable 'ansible_facts' from source: unknown 9396 1727204044.04576: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/AnsiballZ_network_connections.py 9396 1727204044.04705: Sending initial data 9396 1727204044.04709: Sent initial data (166 bytes) 9396 1727204044.05215: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204044.05219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.05221: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204044.05224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.05276: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204044.05280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204044.05282: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204044.05334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204044.07072: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204044.07114: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204044.07161: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpveqrz_kc /root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/AnsiballZ_network_connections.py <<< 9396 1727204044.07165: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/AnsiballZ_network_connections.py" <<< 9396 1727204044.07206: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpveqrz_kc" to remote "/root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/AnsiballZ_network_connections.py" <<< 9396 1727204044.07209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/AnsiballZ_network_connections.py" <<< 9396 1727204044.08330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204044.08410: stderr chunk (state=3): >>><<< 9396 1727204044.08416: stdout chunk (state=3): >>><<< 9396 1727204044.08439: done transferring module to remote 9396 1727204044.08450: _low_level_execute_command(): starting 9396 1727204044.08462: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/ /root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/AnsiballZ_network_connections.py && sleep 0' 9396 1727204044.08943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204044.08946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.08949: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204044.08951: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.09009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204044.09016: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204044.09056: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204044.10993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204044.11045: stderr chunk (state=3): >>><<< 9396 1727204044.11049: stdout chunk (state=3): >>><<< 9396 1727204044.11070: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204044.11073: _low_level_execute_command(): starting 9396 1727204044.11078: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/AnsiballZ_network_connections.py && sleep 0' 9396 1727204044.11553: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204044.11560: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204044.11563: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.11565: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204044.11568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.11622: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
9396 1727204044.11629: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204044.11674: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204044.58674: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": 
"bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 9396 1727204044.60816: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204044.60872: stderr chunk (state=3): >>><<< 9396 1727204044.60876: stdout chunk (state=3): >>><<< 9396 1727204044.60893: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", 
"connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "deprecated-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "master": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "master": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
9396 1727204044.60961: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'deprecated-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'master': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'master': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204044.60971: _low_level_execute_command(): starting 9396 1727204044.60977: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204043.8022313-11208-28218938519482/ > /dev/null 2>&1 && sleep 0' 9396 1727204044.61463: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204044.61467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204044.61469: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 9396 1727204044.61474: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204044.61477: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.61530: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204044.61534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204044.61579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204044.63658: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204044.63705: stderr chunk (state=3): >>><<< 9396 1727204044.63712: stdout chunk (state=3): >>><<< 9396 1727204044.63724: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204044.63736: handler run complete 9396 1727204044.63771: attempt loop complete, returning result 9396 1727204044.63774: _execute() done 9396 1727204044.63777: dumping result to json 9396 1727204044.63785: done dumping result, returning 9396 1727204044.63796: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-36c5-1f9e-000000000036] 9396 1727204044.63802: sending task result for task 12b410aa-8751-36c5-1f9e-000000000036 9396 1727204044.63936: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000036 9396 1727204044.63940: WORKER PROCESS EXITING changed: [managed-node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290 [008] #1, state:up persistent_state:present, 'bond0.0': add 
connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3 [009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1 [010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290 (is-modified) [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3 (not-active) [012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1 (not-active) 9396 1727204044.64121: no more pending results, returning what we have 9396 1727204044.64125: results queue empty 9396 1727204044.64127: checking for any_errors_fatal 9396 1727204044.64134: done checking for any_errors_fatal 9396 1727204044.64135: checking for max_fail_percentage 9396 1727204044.64136: done checking for max_fail_percentage 9396 1727204044.64137: checking to see if all hosts have failed and the running result is not ok 9396 1727204044.64138: done checking to see if all hosts have failed 9396 1727204044.64139: getting the remaining hosts for this loop 9396 1727204044.64140: done getting the remaining hosts for this loop 9396 1727204044.64145: getting the next task for host managed-node1 9396 1727204044.64152: done getting next task for host managed-node1 9396 1727204044.64156: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 9396 1727204044.64159: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 9396 1727204044.64170: getting variables 9396 1727204044.64172: in VariableManager get_vars() 9396 1727204044.64225: Calling all_inventory to load vars for managed-node1 9396 1727204044.64229: Calling groups_inventory to load vars for managed-node1 9396 1727204044.64232: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204044.64242: Calling all_plugins_play to load vars for managed-node1 9396 1727204044.64245: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204044.64249: Calling groups_plugins_play to load vars for managed-node1 9396 1727204044.65470: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204044.67053: done with get_vars() 9396 1727204044.67079: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:04 -0400 (0:00:01.004) 0:00:20.642 ***** 9396 1727204044.67160: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 9396 1727204044.67161: Creating lock for fedora.linux_system_roles.network_state 9396 1727204044.67440: worker is 1 (out of 1 available) 9396 1727204044.67455: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 9396 1727204044.67469: done queuing things up, now waiting for results queue to drain 9396 1727204044.67472: waiting for pending results... 
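The `module_args` logged for the "Configure networking connection profiles" task above imply a set of role variables, even though the playbook itself is not part of this log. The following is a hypothetical reconstruction from those logged arguments only (the play layout, `hosts` line, and use of `network_provider` are assumptions, not taken from this log):

```yaml
# Hypothetical reconstruction from the logged module_args; the actual
# playbook that produced this run is not shown in the log.
- hosts: managed-node1
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_provider: nm          # logged as provider: "nm"
        network_connections:
          - name: bond0
            state: up
            type: bond
            interface_name: deprecated-bond
            bond:
              mode: active-backup
              miimon: 110
            ip:
              route_metric4: 65535
          - name: bond0.0
            state: up
            type: ethernet
            interface_name: test1
            master: bond0
          - name: bond0.1
            state: up
            type: ethernet
            interface_name: test2
            master: bond0
```

This matches the invocation dumped twice in the result above: one bond profile plus two ethernet port profiles enslaved to it, all brought `up`.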
9396 1727204044.67667: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 9396 1727204044.67774: in run() - task 12b410aa-8751-36c5-1f9e-000000000037 9396 1727204044.67787: variable 'ansible_search_path' from source: unknown 9396 1727204044.67793: variable 'ansible_search_path' from source: unknown 9396 1727204044.67830: calling self._execute() 9396 1727204044.67903: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.67916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.67924: variable 'omit' from source: magic vars 9396 1727204044.68249: variable 'ansible_distribution_major_version' from source: facts 9396 1727204044.68263: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204044.68365: variable 'network_state' from source: role '' defaults 9396 1727204044.68382: Evaluated conditional (network_state != {}): False 9396 1727204044.68386: when evaluation is False, skipping this task 9396 1727204044.68391: _execute() done 9396 1727204044.68394: dumping result to json 9396 1727204044.68400: done dumping result, returning 9396 1727204044.68410: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-36c5-1f9e-000000000037] 9396 1727204044.68417: sending task result for task 12b410aa-8751-36c5-1f9e-000000000037 9396 1727204044.68513: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000037 9396 1727204044.68516: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 9396 1727204044.68574: no more pending results, returning what we have 9396 1727204044.68579: results queue empty 9396 1727204044.68580: checking for any_errors_fatal 9396 1727204044.68598: done checking for any_errors_fatal 9396 1727204044.68599: 
checking for max_fail_percentage 9396 1727204044.68601: done checking for max_fail_percentage 9396 1727204044.68603: checking to see if all hosts have failed and the running result is not ok 9396 1727204044.68604: done checking to see if all hosts have failed 9396 1727204044.68605: getting the remaining hosts for this loop 9396 1727204044.68606: done getting the remaining hosts for this loop 9396 1727204044.68611: getting the next task for host managed-node1 9396 1727204044.68618: done getting next task for host managed-node1 9396 1727204044.68623: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 9396 1727204044.68626: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204044.68644: getting variables 9396 1727204044.68646: in VariableManager get_vars() 9396 1727204044.68686: Calling all_inventory to load vars for managed-node1 9396 1727204044.68691: Calling groups_inventory to load vars for managed-node1 9396 1727204044.68694: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204044.68705: Calling all_plugins_play to load vars for managed-node1 9396 1727204044.68708: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204044.68712: Calling groups_plugins_play to load vars for managed-node1 9396 1727204044.69997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204044.71545: done with get_vars() 9396 1727204044.71567: done getting variables 9396 1727204044.71621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:04 -0400 (0:00:00.044) 0:00:20.687 ***** 9396 1727204044.71650: entering _queue_task() for managed-node1/debug 9396 1727204044.71908: worker is 1 (out of 1 available) 9396 1727204044.71924: exiting _queue_task() for managed-node1/debug 9396 1727204044.71937: done queuing things up, now waiting for results queue to drain 9396 1727204044.71940: waiting for pending results... 
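The `network_connections` stderr lines logged earlier (the `[007]`–`[012]` entries, re-emitted below by the "Show stderr messages" debug task) follow a fixed shape: sequence number, connection index, state, profile name, action, UUID, and an optional activation flag such as `(not-active)` or `(is-modified)`. A small standalone sketch for pulling those fields out when post-processing a log like this one (not part of the role or of Ansible; the regex is an assumption based on the lines seen in this run):

```python
import re

# Parses role stderr lines of the form:
#   [011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 720e7a7e-... (not-active)
# The pattern is inferred from this log, not from the role's source.
LINE_RE = re.compile(
    r"\[(?P<seq>\d+)\]\s+#(?P<idx>\d+),\s+state:(?P<state>\S+)\s+"
    r"persistent_state:(?P<pstate>\S+),\s+'(?P<name>[^']+)':\s+"
    r"(?P<action>\S+) connection (?P=name), (?P<uuid>[0-9a-f-]{36})"
    r"(?:\s+\((?P<flag>[^)]+)\))?"
)

def parse_stderr_lines(lines):
    """Return one dict per recognized connection operation; skip non-matching lines."""
    ops = []
    for line in lines:
        m = LINE_RE.match(line)
        if m:
            ops.append(m.groupdict())
    return ops
```

For the run above this yields six operations: three `add` entries with no flag, one `up` with `is-modified`, and two `up` with `not-active`.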
9396 1727204044.72132: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 9396 1727204044.72237: in run() - task 12b410aa-8751-36c5-1f9e-000000000038 9396 1727204044.72251: variable 'ansible_search_path' from source: unknown 9396 1727204044.72254: variable 'ansible_search_path' from source: unknown 9396 1727204044.72293: calling self._execute() 9396 1727204044.72369: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.72376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.72387: variable 'omit' from source: magic vars 9396 1727204044.72703: variable 'ansible_distribution_major_version' from source: facts 9396 1727204044.72718: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204044.72728: variable 'omit' from source: magic vars 9396 1727204044.72776: variable 'omit' from source: magic vars 9396 1727204044.72808: variable 'omit' from source: magic vars 9396 1727204044.72847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204044.72880: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204044.72899: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204044.72920: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204044.72935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204044.72962: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204044.72965: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.72968: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node1' 9396 1727204044.73052: Set connection var ansible_timeout to 10 9396 1727204044.73063: Set connection var ansible_shell_executable to /bin/sh 9396 1727204044.73071: Set connection var ansible_pipelining to False 9396 1727204044.73078: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204044.73084: Set connection var ansible_connection to ssh 9396 1727204044.73087: Set connection var ansible_shell_type to sh 9396 1727204044.73114: variable 'ansible_shell_executable' from source: unknown 9396 1727204044.73117: variable 'ansible_connection' from source: unknown 9396 1727204044.73120: variable 'ansible_module_compression' from source: unknown 9396 1727204044.73123: variable 'ansible_shell_type' from source: unknown 9396 1727204044.73125: variable 'ansible_shell_executable' from source: unknown 9396 1727204044.73127: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.73134: variable 'ansible_pipelining' from source: unknown 9396 1727204044.73136: variable 'ansible_timeout' from source: unknown 9396 1727204044.73143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.73274: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204044.73285: variable 'omit' from source: magic vars 9396 1727204044.73292: starting attempt loop 9396 1727204044.73295: running the handler 9396 1727204044.73403: variable '__network_connections_result' from source: set_fact 9396 1727204044.73458: handler run complete 9396 1727204044.73480: attempt loop complete, returning result 9396 1727204044.73484: _execute() done 9396 1727204044.73487: dumping result to json 9396 1727204044.73490: done dumping result, returning 9396 
1727204044.73499: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-36c5-1f9e-000000000038] 9396 1727204044.73505: sending task result for task 12b410aa-8751-36c5-1f9e-000000000038 9396 1727204044.73603: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000038 9396 1727204044.73606: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1 (not-active)" ] } 9396 1727204044.73679: no more pending results, returning what we have 9396 1727204044.73684: results queue empty 9396 1727204044.73685: checking for any_errors_fatal 9396 1727204044.73694: done checking for any_errors_fatal 9396 1727204044.73695: checking for max_fail_percentage 9396 1727204044.73697: done checking for max_fail_percentage 9396 1727204044.73698: checking to see if all hosts have failed and the running result is not ok 9396 1727204044.73699: done checking to see if all hosts have failed 9396 1727204044.73700: getting the remaining hosts for this loop 9396 1727204044.73701: done getting the remaining hosts for this loop 9396 1727204044.73706: getting the next task for host managed-node1 9396 1727204044.73715: done getting next 
task for host managed-node1 9396 1727204044.73719: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 9396 1727204044.73722: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204044.73734: getting variables 9396 1727204044.73735: in VariableManager get_vars() 9396 1727204044.73773: Calling all_inventory to load vars for managed-node1 9396 1727204044.73776: Calling groups_inventory to load vars for managed-node1 9396 1727204044.73779: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204044.73795: Calling all_plugins_play to load vars for managed-node1 9396 1727204044.73800: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204044.73804: Calling groups_plugins_play to load vars for managed-node1 9396 1727204044.75070: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204044.76647: done with get_vars() 9396 1727204044.76675: done getting variables 9396 1727204044.76734: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:04 -0400 (0:00:00.051) 0:00:20.738 ***** 9396 1727204044.76768: entering _queue_task() for managed-node1/debug 9396 1727204044.77037: worker is 1 (out of 1 available) 9396 1727204044.77052: exiting _queue_task() for managed-node1/debug 9396 1727204044.77066: done queuing things up, now waiting for results queue to drain 9396 1727204044.77068: waiting for pending results... 9396 1727204044.77263: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 9396 1727204044.77369: in run() - task 12b410aa-8751-36c5-1f9e-000000000039 9396 1727204044.77382: variable 'ansible_search_path' from source: unknown 9396 1727204044.77387: variable 'ansible_search_path' from source: unknown 9396 1727204044.77426: calling self._execute() 9396 1727204044.77498: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.77505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.77519: variable 'omit' from source: magic vars 9396 1727204044.77831: variable 'ansible_distribution_major_version' from source: facts 9396 1727204044.77846: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204044.77851: variable 'omit' from source: magic vars 9396 1727204044.77899: variable 'omit' from source: magic vars 9396 1727204044.77930: variable 'omit' from source: magic vars 9396 1727204044.77969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204044.78002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204044.78021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204044.78037: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204044.78050: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204044.78081: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204044.78084: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.78087: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.78169: Set connection var ansible_timeout to 10 9396 1727204044.78185: Set connection var ansible_shell_executable to /bin/sh 9396 1727204044.78195: Set connection var ansible_pipelining to False 9396 1727204044.78202: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204044.78209: Set connection var ansible_connection to ssh 9396 1727204044.78215: Set connection var ansible_shell_type to sh 9396 1727204044.78239: variable 'ansible_shell_executable' from source: unknown 9396 1727204044.78242: variable 'ansible_connection' from source: unknown 9396 1727204044.78245: variable 'ansible_module_compression' from source: unknown 9396 1727204044.78248: variable 'ansible_shell_type' from source: unknown 9396 1727204044.78253: variable 'ansible_shell_executable' from source: unknown 9396 1727204044.78255: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.78261: variable 'ansible_pipelining' from source: unknown 9396 1727204044.78264: variable 'ansible_timeout' from source: unknown 9396 1727204044.78269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.78395: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=False) 9396 1727204044.78407: variable 'omit' from source: magic vars 9396 1727204044.78417: starting attempt loop 9396 1727204044.78420: running the handler 9396 1727204044.78461: variable '__network_connections_result' from source: set_fact 9396 1727204044.78533: variable '__network_connections_result' from source: set_fact 9396 1727204044.78681: handler run complete 9396 1727204044.78711: attempt loop complete, returning result 9396 1727204044.78714: _execute() done 9396 1727204044.78717: dumping result to json 9396 1727204044.78730: done dumping result, returning 9396 1727204044.78736: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-36c5-1f9e-000000000039] 9396 1727204044.78744: sending task result for task 12b410aa-8751-36c5-1f9e-000000000039 9396 1727204044.78855: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000039 9396 1727204044.78857: WORKER PROCESS EXITING ok: [managed-node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "deprecated-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "interface_name": "test1", "master": "bond0", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "interface_name": "test2", "master": "bond0", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3\n[009] #2, state:up 
persistent_state:present, 'bond0.1': add connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290 (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1 (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 7ec0abe6-17a3-4aaf-8c88-591c4827c290 (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, 720e7a7e-de4c-4117-856c-30dd1c763bd3 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 786a4cc1-8289-45a7-9bfe-8d14a2de36f1 (not-active)" ] } } 9396 1727204044.78983: no more pending results, returning what we have 9396 1727204044.78987: results queue empty 9396 1727204044.79002: checking for any_errors_fatal 9396 1727204044.79008: done checking for any_errors_fatal 9396 1727204044.79009: checking for max_fail_percentage 9396 1727204044.79012: done checking for max_fail_percentage 9396 1727204044.79012: checking to see if all hosts have failed and the running result is not ok 9396 1727204044.79014: done checking to see if all hosts have failed 9396 1727204044.79014: getting the remaining hosts for this loop 9396 1727204044.79016: done getting the remaining hosts for this loop 9396 1727204044.79019: getting the next task for host managed-node1 9396 1727204044.79025: done 
getting next task for host managed-node1 9396 1727204044.79029: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 9396 1727204044.79032: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204044.79043: getting variables 9396 1727204044.79045: in VariableManager get_vars() 9396 1727204044.79082: Calling all_inventory to load vars for managed-node1 9396 1727204044.79085: Calling groups_inventory to load vars for managed-node1 9396 1727204044.79088: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204044.79106: Calling all_plugins_play to load vars for managed-node1 9396 1727204044.79112: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204044.79115: Calling groups_plugins_play to load vars for managed-node1 9396 1727204044.80295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204044.81870: done with get_vars() 9396 1727204044.81894: done getting variables 9396 1727204044.81949: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:04 -0400 (0:00:00.052) 0:00:20.790 ***** 9396 1727204044.81975: entering _queue_task() for managed-node1/debug 9396 1727204044.82229: worker is 1 (out of 1 available) 9396 1727204044.82243: exiting _queue_task() for managed-node1/debug 9396 1727204044.82256: done queuing things up, now waiting for results queue to drain 9396 1727204044.82258: waiting for pending results... 9396 1727204044.82452: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 9396 1727204044.82557: in run() - task 12b410aa-8751-36c5-1f9e-00000000003a 9396 1727204044.82570: variable 'ansible_search_path' from source: unknown 9396 1727204044.82574: variable 'ansible_search_path' from source: unknown 9396 1727204044.82613: calling self._execute() 9396 1727204044.82686: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.82694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.82706: variable 'omit' from source: magic vars 9396 1727204044.83017: variable 'ansible_distribution_major_version' from source: facts 9396 1727204044.83029: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204044.83132: variable 'network_state' from source: role '' defaults 9396 1727204044.83145: Evaluated conditional (network_state != {}): False 9396 1727204044.83150: when evaluation is False, skipping this task 9396 1727204044.83153: _execute() done 9396 1727204044.83156: dumping result to json 9396 1727204044.83158: done dumping result, returning 9396 1727204044.83168: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-36c5-1f9e-00000000003a] 9396 1727204044.83175: sending task result for task 
12b410aa-8751-36c5-1f9e-00000000003a 9396 1727204044.83273: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000003a 9396 1727204044.83275: WORKER PROCESS EXITING skipping: [managed-node1] => { "false_condition": "network_state != {}" } 9396 1727204044.83329: no more pending results, returning what we have 9396 1727204044.83333: results queue empty 9396 1727204044.83335: checking for any_errors_fatal 9396 1727204044.83344: done checking for any_errors_fatal 9396 1727204044.83345: checking for max_fail_percentage 9396 1727204044.83347: done checking for max_fail_percentage 9396 1727204044.83348: checking to see if all hosts have failed and the running result is not ok 9396 1727204044.83349: done checking to see if all hosts have failed 9396 1727204044.83350: getting the remaining hosts for this loop 9396 1727204044.83351: done getting the remaining hosts for this loop 9396 1727204044.83356: getting the next task for host managed-node1 9396 1727204044.83362: done getting next task for host managed-node1 9396 1727204044.83366: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 9396 1727204044.83369: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204044.83383: getting variables 9396 1727204044.83385: in VariableManager get_vars() 9396 1727204044.83425: Calling all_inventory to load vars for managed-node1 9396 1727204044.83428: Calling groups_inventory to load vars for managed-node1 9396 1727204044.83431: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204044.83441: Calling all_plugins_play to load vars for managed-node1 9396 1727204044.83444: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204044.83448: Calling groups_plugins_play to load vars for managed-node1 9396 1727204044.84714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204044.86268: done with get_vars() 9396 1727204044.86295: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:04 -0400 (0:00:00.043) 0:00:20.834 ***** 9396 1727204044.86377: entering _queue_task() for managed-node1/ping 9396 1727204044.86378: Creating lock for ping 9396 1727204044.86647: worker is 1 (out of 1 available) 9396 1727204044.86663: exiting _queue_task() for managed-node1/ping 9396 1727204044.86676: done queuing things up, now waiting for results queue to drain 9396 1727204044.86678: waiting for pending results... 
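The `module_args` echoed in the `Show debug messages for the network_connections` result above imply role input along the following lines. This is a reconstruction from the logged invocation, not the original playbook source; `network_connections` is the role's documented input variable, and every field value below is taken verbatim from the log:

```yaml
# Reconstructed from the logged module_args; not the original playbook.
network_connections:
  - name: bond0
    type: bond
    state: up
    interface_name: deprecated-bond
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    type: ethernet
    state: up
    interface_name: test1
    master: bond0
  - name: bond0.1
    type: ethernet
    state: up
    interface_name: test2
    master: bond0
```

The stderr lines `[007]`-`[012]` in the result show the provider (`nm`) first adding all three profiles, then activating each one, with the reason for activation in parentheses: `(is-modified)` for `bond0` and `(not-active)` for the two ports.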
9396 1727204044.86870: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 9396 1727204044.86970: in run() - task 12b410aa-8751-36c5-1f9e-00000000003b 9396 1727204044.86984: variable 'ansible_search_path' from source: unknown 9396 1727204044.86988: variable 'ansible_search_path' from source: unknown 9396 1727204044.87031: calling self._execute() 9396 1727204044.87106: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.87115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.87128: variable 'omit' from source: magic vars 9396 1727204044.87457: variable 'ansible_distribution_major_version' from source: facts 9396 1727204044.87461: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204044.87470: variable 'omit' from source: magic vars 9396 1727204044.87520: variable 'omit' from source: magic vars 9396 1727204044.87549: variable 'omit' from source: magic vars 9396 1727204044.87588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204044.87622: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204044.87640: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204044.87656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204044.87672: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204044.87699: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204044.87703: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.87706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 
1727204044.87788: Set connection var ansible_timeout to 10 9396 1727204044.87796: Set connection var ansible_shell_executable to /bin/sh 9396 1727204044.87806: Set connection var ansible_pipelining to False 9396 1727204044.87815: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204044.87821: Set connection var ansible_connection to ssh 9396 1727204044.87824: Set connection var ansible_shell_type to sh 9396 1727204044.87847: variable 'ansible_shell_executable' from source: unknown 9396 1727204044.87850: variable 'ansible_connection' from source: unknown 9396 1727204044.87853: variable 'ansible_module_compression' from source: unknown 9396 1727204044.87857: variable 'ansible_shell_type' from source: unknown 9396 1727204044.87859: variable 'ansible_shell_executable' from source: unknown 9396 1727204044.87864: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204044.87871: variable 'ansible_pipelining' from source: unknown 9396 1727204044.87874: variable 'ansible_timeout' from source: unknown 9396 1727204044.87879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204044.88054: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204044.88064: variable 'omit' from source: magic vars 9396 1727204044.88070: starting attempt loop 9396 1727204044.88073: running the handler 9396 1727204044.88087: _low_level_execute_command(): starting 9396 1727204044.88096: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204044.88635: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204044.88640: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204044.88643: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.88694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204044.88705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204044.88724: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204044.88757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204044.90522: stdout chunk (state=3): >>>/root <<< 9396 1727204044.90617: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204044.90672: stderr chunk (state=3): >>><<< 9396 1727204044.90676: stdout chunk (state=3): >>><<< 9396 1727204044.90698: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204044.90711: _low_level_execute_command(): starting 9396 1727204044.90715: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788 `" && echo ansible-tmp-1727204044.9069724-11227-280050465020788="` echo /root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788 `" ) && sleep 0' 9396 1727204044.91173: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204044.91177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.91182: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 
1727204044.91193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204044.91243: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204044.91249: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204044.91302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204044.93380: stdout chunk (state=3): >>>ansible-tmp-1727204044.9069724-11227-280050465020788=/root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788 <<< 9396 1727204044.93504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204044.93548: stderr chunk (state=3): >>><<< 9396 1727204044.93551: stdout chunk (state=3): >>><<< 9396 1727204044.93570: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204044.9069724-11227-280050465020788=/root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204044.93612: variable 'ansible_module_compression' from source: unknown 9396 1727204044.93644: ANSIBALLZ: Using lock for ping 9396 1727204044.93647: ANSIBALLZ: Acquiring lock 9396 1727204044.93650: ANSIBALLZ: Lock acquired: 139797138822720 9396 1727204044.93653: ANSIBALLZ: Creating module 9396 1727204045.05334: ANSIBALLZ: Writing module into payload 9396 1727204045.05386: ANSIBALLZ: Writing module 9396 1727204045.05405: ANSIBALLZ: Renaming module 9396 1727204045.05411: ANSIBALLZ: Done creating module 9396 1727204045.05426: variable 'ansible_facts' from source: unknown 9396 1727204045.05474: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/AnsiballZ_ping.py 9396 1727204045.05591: Sending initial data 9396 1727204045.05595: Sent initial data (152 bytes) 9396 1727204045.06076: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.06080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 9396 1727204045.06083: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204045.06085: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.06148: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204045.06151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204045.06156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.06207: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.07911: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 9396 1727204045.07918: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204045.07951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204045.07993: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp91l6kk5j /root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/AnsiballZ_ping.py <<< 9396 1727204045.08005: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/AnsiballZ_ping.py" <<< 9396 1727204045.08032: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp91l6kk5j" to remote "/root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/AnsiballZ_ping.py" <<< 9396 1727204045.08802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204045.08862: stderr chunk (state=3): >>><<< 9396 1727204045.08867: stdout chunk (state=3): >>><<< 9396 1727204045.08892: done transferring module to remote 9396 1727204045.08903: _low_level_execute_command(): starting 9396 1727204045.08911: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/ /root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/AnsiballZ_ping.py && sleep 0' 9396 1727204045.09365: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204045.09368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204045.09371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.09373: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.09375: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.09439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204045.09443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204045.09446: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.09480: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.11380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204045.11427: stderr chunk (state=3): >>><<< 9396 1727204045.11431: stdout chunk (state=3): >>><<< 9396 1727204045.11444: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204045.11447: _low_level_execute_command(): starting 9396 1727204045.11453: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/AnsiballZ_ping.py && sleep 0' 9396 1727204045.11866: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.11903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204045.11909: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204045.11912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.11914: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.11917: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.11968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 
1727204045.11971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.12024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.29412: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 9396 1727204045.30849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204045.30915: stderr chunk (state=3): >>><<< 9396 1727204045.30919: stdout chunk (state=3): >>><<< 9396 1727204045.30932: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
9396 1727204045.30957: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204045.30968: _low_level_execute_command(): starting 9396 1727204045.30973: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204044.9069724-11227-280050465020788/ > /dev/null 2>&1 && sleep 0' 9396 1727204045.31475: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.31479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.31482: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.31484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.31537: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204045.31541: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.31596: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.33529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204045.33575: stderr chunk (state=3): >>><<< 9396 1727204045.33578: stdout chunk (state=3): >>><<< 9396 1727204045.33597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204045.33606: handler run complete 9396 1727204045.33623: attempt loop complete, returning result 9396 
1727204045.33626: _execute() done 9396 1727204045.33631: dumping result to json 9396 1727204045.33633: done dumping result, returning 9396 1727204045.33645: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-36c5-1f9e-00000000003b] 9396 1727204045.33650: sending task result for task 12b410aa-8751-36c5-1f9e-00000000003b 9396 1727204045.33748: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000003b 9396 1727204045.33751: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 9396 1727204045.33819: no more pending results, returning what we have 9396 1727204045.33824: results queue empty 9396 1727204045.33825: checking for any_errors_fatal 9396 1727204045.33832: done checking for any_errors_fatal 9396 1727204045.33833: checking for max_fail_percentage 9396 1727204045.33835: done checking for max_fail_percentage 9396 1727204045.33836: checking to see if all hosts have failed and the running result is not ok 9396 1727204045.33837: done checking to see if all hosts have failed 9396 1727204045.33838: getting the remaining hosts for this loop 9396 1727204045.33839: done getting the remaining hosts for this loop 9396 1727204045.33844: getting the next task for host managed-node1 9396 1727204045.33853: done getting next task for host managed-node1 9396 1727204045.33856: ^ task is: TASK: meta (role_complete) 9396 1727204045.33859: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 9396 1727204045.33872: getting variables 9396 1727204045.33873: in VariableManager get_vars() 9396 1727204045.33929: Calling all_inventory to load vars for managed-node1 9396 1727204045.33933: Calling groups_inventory to load vars for managed-node1 9396 1727204045.33936: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204045.33948: Calling all_plugins_play to load vars for managed-node1 9396 1727204045.33951: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204045.33955: Calling groups_plugins_play to load vars for managed-node1 9396 1727204045.35285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204045.36952: done with get_vars() 9396 1727204045.36977: done getting variables 9396 1727204045.37047: done queuing things up, now waiting for results queue to drain 9396 1727204045.37048: results queue empty 9396 1727204045.37049: checking for any_errors_fatal 9396 1727204045.37051: done checking for any_errors_fatal 9396 1727204045.37052: checking for max_fail_percentage 9396 1727204045.37053: done checking for max_fail_percentage 9396 1727204045.37053: checking to see if all hosts have failed and the running result is not ok 9396 1727204045.37054: done checking to see if all hosts have failed 9396 1727204045.37054: getting the remaining hosts for this loop 9396 1727204045.37055: done getting the remaining hosts for this loop 9396 1727204045.37057: getting the next task for host managed-node1 9396 1727204045.37063: done getting next task for host managed-node1 9396 1727204045.37065: ^ task is: TASK: Include the task 'get_interface_stat.yml' 9396 1727204045.37066: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204045.37068: getting variables 9396 1727204045.37070: in VariableManager get_vars() 9396 1727204045.37082: Calling all_inventory to load vars for managed-node1 9396 1727204045.37083: Calling groups_inventory to load vars for managed-node1 9396 1727204045.37085: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204045.37092: Calling all_plugins_play to load vars for managed-node1 9396 1727204045.37094: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204045.37096: Calling groups_plugins_play to load vars for managed-node1 9396 1727204045.38527: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204045.41373: done with get_vars() 9396 1727204045.41418: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:54:05 -0400 (0:00:00.551) 0:00:21.385 ***** 9396 1727204045.41486: entering _queue_task() for managed-node1/include_tasks 9396 1727204045.41766: worker is 1 (out of 1 available) 9396 1727204045.41783: exiting _queue_task() for managed-node1/include_tasks 9396 1727204045.41799: done queuing things up, now waiting for results queue to drain 9396 1727204045.41801: waiting for pending results... 
9396 1727204045.41999: running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' 9396 1727204045.42104: in run() - task 12b410aa-8751-36c5-1f9e-00000000006e 9396 1727204045.42119: variable 'ansible_search_path' from source: unknown 9396 1727204045.42123: variable 'ansible_search_path' from source: unknown 9396 1727204045.42158: calling self._execute() 9396 1727204045.42235: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204045.42247: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204045.42255: variable 'omit' from source: magic vars 9396 1727204045.42580: variable 'ansible_distribution_major_version' from source: facts 9396 1727204045.42588: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204045.42594: _execute() done 9396 1727204045.42600: dumping result to json 9396 1727204045.42604: done dumping result, returning 9396 1727204045.42687: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-36c5-1f9e-00000000006e] 9396 1727204045.42695: sending task result for task 12b410aa-8751-36c5-1f9e-00000000006e 9396 1727204045.42771: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000006e 9396 1727204045.42774: WORKER PROCESS EXITING 9396 1727204045.42815: no more pending results, returning what we have 9396 1727204045.42820: in VariableManager get_vars() 9396 1727204045.42861: Calling all_inventory to load vars for managed-node1 9396 1727204045.42864: Calling groups_inventory to load vars for managed-node1 9396 1727204045.42866: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204045.42878: Calling all_plugins_play to load vars for managed-node1 9396 1727204045.42881: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204045.42885: Calling groups_plugins_play to load vars for managed-node1 9396 1727204045.44962: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204045.48196: done with get_vars() 9396 1727204045.48237: variable 'ansible_search_path' from source: unknown 9396 1727204045.48238: variable 'ansible_search_path' from source: unknown 9396 1727204045.48286: we have included files to process 9396 1727204045.48287: generating all_blocks data 9396 1727204045.48292: done generating all_blocks data 9396 1727204045.48297: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204045.48299: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204045.48302: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 9396 1727204045.48547: done processing included file 9396 1727204045.48550: iterating over new_blocks loaded from include file 9396 1727204045.48552: in VariableManager get_vars() 9396 1727204045.48577: done with get_vars() 9396 1727204045.48579: filtering new block on tags 9396 1727204045.48602: done filtering new block on tags 9396 1727204045.48605: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node1 9396 1727204045.48613: extending task lists for all hosts with included blocks 9396 1727204045.48746: done extending task lists 9396 1727204045.48747: done processing included files 9396 1727204045.48748: results queue empty 9396 1727204045.48749: checking for any_errors_fatal 9396 1727204045.48751: done checking for any_errors_fatal 9396 1727204045.48752: checking for max_fail_percentage 9396 1727204045.48753: done checking for max_fail_percentage 9396 1727204045.48754: 
checking to see if all hosts have failed and the running result is not ok 9396 1727204045.48755: done checking to see if all hosts have failed 9396 1727204045.48756: getting the remaining hosts for this loop 9396 1727204045.48757: done getting the remaining hosts for this loop 9396 1727204045.48760: getting the next task for host managed-node1 9396 1727204045.48764: done getting next task for host managed-node1 9396 1727204045.48766: ^ task is: TASK: Get stat for interface {{ interface }} 9396 1727204045.48768: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204045.48771: getting variables 9396 1727204045.48772: in VariableManager get_vars() 9396 1727204045.48787: Calling all_inventory to load vars for managed-node1 9396 1727204045.48792: Calling groups_inventory to load vars for managed-node1 9396 1727204045.48794: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204045.48800: Calling all_plugins_play to load vars for managed-node1 9396 1727204045.48803: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204045.48806: Calling groups_plugins_play to load vars for managed-node1 9396 1727204045.54925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204045.56604: done with get_vars() 9396 1727204045.56633: done getting variables 9396 1727204045.56759: variable 'interface' from source: task vars 9396 1727204045.56762: variable 'controller_device' from source: play vars 9396 1727204045.56815: variable 'controller_device' from source: play vars TASK [Get stat for interface deprecated-bond] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:54:05 -0400 (0:00:00.153) 0:00:21.539 ***** 9396 1727204045.56841: entering _queue_task() for managed-node1/stat 9396 1727204045.57125: worker is 1 (out of 1 available) 9396 1727204045.57141: exiting _queue_task() for managed-node1/stat 9396 1727204045.57154: done queuing things up, now waiting for results queue to drain 9396 1727204045.57156: waiting for pending results... 
9396 1727204045.57545: running TaskExecutor() for managed-node1/TASK: Get stat for interface deprecated-bond 9396 1727204045.57654: in run() - task 12b410aa-8751-36c5-1f9e-000000000242 9396 1727204045.57677: variable 'ansible_search_path' from source: unknown 9396 1727204045.57687: variable 'ansible_search_path' from source: unknown 9396 1727204045.57747: calling self._execute() 9396 1727204045.57872: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204045.57887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204045.57910: variable 'omit' from source: magic vars 9396 1727204045.58403: variable 'ansible_distribution_major_version' from source: facts 9396 1727204045.58426: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204045.58439: variable 'omit' from source: magic vars 9396 1727204045.58533: variable 'omit' from source: magic vars 9396 1727204045.58668: variable 'interface' from source: task vars 9396 1727204045.58680: variable 'controller_device' from source: play vars 9396 1727204045.58779: variable 'controller_device' from source: play vars 9396 1727204045.58823: variable 'omit' from source: magic vars 9396 1727204045.58880: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204045.58947: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204045.58976: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204045.59009: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204045.59040: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204045.59086: variable 'inventory_hostname' from source: host vars for 'managed-node1' 
9396 1727204045.59100: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204045.59113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204045.59271: Set connection var ansible_timeout to 10 9396 1727204045.59286: Set connection var ansible_shell_executable to /bin/sh 9396 1727204045.59304: Set connection var ansible_pipelining to False 9396 1727204045.59352: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204045.59361: Set connection var ansible_connection to ssh 9396 1727204045.59365: Set connection var ansible_shell_type to sh 9396 1727204045.59393: variable 'ansible_shell_executable' from source: unknown 9396 1727204045.59404: variable 'ansible_connection' from source: unknown 9396 1727204045.59462: variable 'ansible_module_compression' from source: unknown 9396 1727204045.59470: variable 'ansible_shell_type' from source: unknown 9396 1727204045.59473: variable 'ansible_shell_executable' from source: unknown 9396 1727204045.59475: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204045.59483: variable 'ansible_pipelining' from source: unknown 9396 1727204045.59486: variable 'ansible_timeout' from source: unknown 9396 1727204045.59488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204045.59746: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204045.59764: variable 'omit' from source: magic vars 9396 1727204045.59776: starting attempt loop 9396 1727204045.59799: running the handler 9396 1727204045.59895: _low_level_execute_command(): starting 9396 1727204045.59899: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204045.60713: stderr chunk (state=2): 
>>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.60812: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204045.60841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.60927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.62701: stdout chunk (state=3): >>>/root <<< 9396 1727204045.62904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204045.62910: stdout chunk (state=3): >>><<< 9396 1727204045.62913: stderr chunk (state=3): >>><<< 9396 1727204045.63036: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204045.63040: _low_level_execute_command(): starting 9396 1727204045.63043: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616 `" && echo ansible-tmp-1727204045.6293669-11247-98538502221616="` echo /root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616 `" ) && sleep 0' 9396 1727204045.63625: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204045.63647: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204045.63663: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.63683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204045.63705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204045.63750: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.63779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204045.63860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.63893: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204045.63923: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.64004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.66073: stdout chunk (state=3): >>>ansible-tmp-1727204045.6293669-11247-98538502221616=/root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616 <<< 9396 1727204045.66268: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204045.66288: stderr chunk (state=3): >>><<< 9396 1727204045.66301: stdout chunk (state=3): >>><<< 9396 1727204045.66330: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204045.6293669-11247-98538502221616=/root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204045.66399: variable 'ansible_module_compression' from source: unknown 9396 1727204045.66470: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9396 1727204045.66532: variable 'ansible_facts' from source: unknown 9396 1727204045.66639: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/AnsiballZ_stat.py 9396 1727204045.66921: Sending initial data 9396 1727204045.66924: Sent initial data (151 bytes) 9396 1727204045.67483: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 9396 1727204045.67544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204045.67551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.67587: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.69270: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204045.69320: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204045.69364: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpiijf7iq5 /root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/AnsiballZ_stat.py <<< 9396 1727204045.69385: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/AnsiballZ_stat.py" <<< 9396 1727204045.69417: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpiijf7iq5" to remote "/root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/AnsiballZ_stat.py" <<< 9396 1727204045.70696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204045.70700: stderr chunk (state=3): >>><<< 9396 1727204045.70702: stdout chunk (state=3): >>><<< 9396 1727204045.70705: done transferring module to remote 9396 1727204045.70707: _low_level_execute_command(): starting 9396 1727204045.70709: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/ /root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/AnsiballZ_stat.py && sleep 0' 9396 1727204045.71485: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204045.71505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.71538: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204045.71558: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204045.71584: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.71670: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.73726: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204045.73734: stdout chunk (state=3): >>><<< 9396 1727204045.73736: stderr chunk (state=3): >>><<< 9396 1727204045.73850: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204045.73854: _low_level_execute_command(): starting 9396 1727204045.73857: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/AnsiballZ_stat.py && sleep 0' 9396 1727204045.74448: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204045.74463: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204045.74476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.74498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204045.74524: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204045.74606: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.74659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204045.74692: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
9396 1727204045.74779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.92461: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34888, "dev": 23, "nlink": 1, "atime": 1727204044.4339395, "mtime": 1727204044.4339395, "ctime": 1727204044.4339395, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 9396 1727204045.93951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204045.94005: stderr chunk (state=3): >>><<< 9396 1727204045.94009: stdout chunk (state=3): >>><<< 9396 1727204045.94028: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/deprecated-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 34888, "dev": 23, "nlink": 1, "atime": 1727204044.4339395, "mtime": 1727204044.4339395, "ctime": 1727204044.4339395, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/deprecated-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204045.94081: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204045.94093: _low_level_execute_command(): starting 9396 1727204045.94099: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204045.6293669-11247-98538502221616/ > /dev/null 2>&1 && sleep 0' 9396 1727204045.94550: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204045.94555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.94562: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204045.94565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204045.94624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204045.94627: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204045.94666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204045.96668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204045.96717: stderr chunk (state=3): >>><<< 9396 1727204045.96721: stdout chunk (state=3): >>><<< 9396 1727204045.96735: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204045.96743: handler run complete 9396 1727204045.96793: attempt loop complete, returning result 9396 1727204045.96797: _execute() done 9396 1727204045.96799: dumping result to json 9396 1727204045.96809: done dumping result, returning 9396 1727204045.96816: done running TaskExecutor() for managed-node1/TASK: Get stat for interface deprecated-bond [12b410aa-8751-36c5-1f9e-000000000242] 9396 1727204045.96821: sending task result for task 12b410aa-8751-36c5-1f9e-000000000242 9396 1727204045.96941: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000242 9396 1727204045.96944: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "atime": 1727204044.4339395, "block_size": 4096, "blocks": 0, "ctime": 1727204044.4339395, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 34888, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/deprecated-bond", "lnk_target": "../../devices/virtual/net/deprecated-bond", "mode": "0777", "mtime": 1727204044.4339395, "nlink": 1, "path": "/sys/class/net/deprecated-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 9396 1727204045.97073: no more pending results, returning what we have 9396 1727204045.97077: results queue empty 9396 1727204045.97078: checking for any_errors_fatal 9396 1727204045.97080: done 
checking for any_errors_fatal 9396 1727204045.97081: checking for max_fail_percentage 9396 1727204045.97083: done checking for max_fail_percentage 9396 1727204045.97084: checking to see if all hosts have failed and the running result is not ok 9396 1727204045.97085: done checking to see if all hosts have failed 9396 1727204045.97086: getting the remaining hosts for this loop 9396 1727204045.97087: done getting the remaining hosts for this loop 9396 1727204045.97094: getting the next task for host managed-node1 9396 1727204045.97102: done getting next task for host managed-node1 9396 1727204045.97105: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 9396 1727204045.97110: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204045.97115: getting variables 9396 1727204045.97116: in VariableManager get_vars() 9396 1727204045.97154: Calling all_inventory to load vars for managed-node1 9396 1727204045.97157: Calling groups_inventory to load vars for managed-node1 9396 1727204045.97160: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204045.97179: Calling all_plugins_play to load vars for managed-node1 9396 1727204045.97182: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204045.97186: Calling groups_plugins_play to load vars for managed-node1 9396 1727204045.98396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204045.99975: done with get_vars() 9396 1727204046.00003: done getting variables 9396 1727204046.00056: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204046.00158: variable 'interface' from source: task vars 9396 1727204046.00161: variable 'controller_device' from source: play vars 9396 1727204046.00214: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'deprecated-bond'] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.433) 0:00:21.973 ***** 9396 1727204046.00241: entering _queue_task() for managed-node1/assert 9396 1727204046.00513: worker is 1 (out of 1 available) 9396 1727204046.00527: exiting _queue_task() for managed-node1/assert 9396 1727204046.00539: done queuing things up, now waiting for results queue to drain 9396 1727204046.00541: waiting for pending results... 
9396 1727204046.00733: running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'deprecated-bond' 9396 1727204046.00829: in run() - task 12b410aa-8751-36c5-1f9e-00000000006f 9396 1727204046.00840: variable 'ansible_search_path' from source: unknown 9396 1727204046.00843: variable 'ansible_search_path' from source: unknown 9396 1727204046.00880: calling self._execute() 9396 1727204046.00956: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.00962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.00974: variable 'omit' from source: magic vars 9396 1727204046.01282: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.01294: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.01301: variable 'omit' from source: magic vars 9396 1727204046.01344: variable 'omit' from source: magic vars 9396 1727204046.01427: variable 'interface' from source: task vars 9396 1727204046.01433: variable 'controller_device' from source: play vars 9396 1727204046.01482: variable 'controller_device' from source: play vars 9396 1727204046.01501: variable 'omit' from source: magic vars 9396 1727204046.01540: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204046.01574: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204046.01595: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204046.01647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204046.01653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204046.01657: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 9396 1727204046.01659: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.01662: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.01744: Set connection var ansible_timeout to 10 9396 1727204046.01750: Set connection var ansible_shell_executable to /bin/sh 9396 1727204046.01762: Set connection var ansible_pipelining to False 9396 1727204046.01768: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204046.01777: Set connection var ansible_connection to ssh 9396 1727204046.01780: Set connection var ansible_shell_type to sh 9396 1727204046.01804: variable 'ansible_shell_executable' from source: unknown 9396 1727204046.01810: variable 'ansible_connection' from source: unknown 9396 1727204046.01813: variable 'ansible_module_compression' from source: unknown 9396 1727204046.01815: variable 'ansible_shell_type' from source: unknown 9396 1727204046.01818: variable 'ansible_shell_executable' from source: unknown 9396 1727204046.01821: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.01826: variable 'ansible_pipelining' from source: unknown 9396 1727204046.01829: variable 'ansible_timeout' from source: unknown 9396 1727204046.01834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.01956: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204046.01966: variable 'omit' from source: magic vars 9396 1727204046.01978: starting attempt loop 9396 1727204046.01981: running the handler 9396 1727204046.02092: variable 'interface_stat' from source: set_fact 9396 1727204046.02113: Evaluated conditional (interface_stat.stat.exists): True 9396 
1727204046.02117: handler run complete 9396 1727204046.02133: attempt loop complete, returning result 9396 1727204046.02136: _execute() done 9396 1727204046.02139: dumping result to json 9396 1727204046.02142: done dumping result, returning 9396 1727204046.02149: done running TaskExecutor() for managed-node1/TASK: Assert that the interface is present - 'deprecated-bond' [12b410aa-8751-36c5-1f9e-00000000006f] 9396 1727204046.02155: sending task result for task 12b410aa-8751-36c5-1f9e-00000000006f 9396 1727204046.02247: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000006f 9396 1727204046.02250: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204046.02306: no more pending results, returning what we have 9396 1727204046.02312: results queue empty 9396 1727204046.02313: checking for any_errors_fatal 9396 1727204046.02325: done checking for any_errors_fatal 9396 1727204046.02325: checking for max_fail_percentage 9396 1727204046.02327: done checking for max_fail_percentage 9396 1727204046.02329: checking to see if all hosts have failed and the running result is not ok 9396 1727204046.02330: done checking to see if all hosts have failed 9396 1727204046.02332: getting the remaining hosts for this loop 9396 1727204046.02333: done getting the remaining hosts for this loop 9396 1727204046.02337: getting the next task for host managed-node1 9396 1727204046.02344: done getting next task for host managed-node1 9396 1727204046.02348: ^ task is: TASK: Include the task 'assert_profile_present.yml' 9396 1727204046.02350: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204046.02354: getting variables 9396 1727204046.02355: in VariableManager get_vars() 9396 1727204046.02403: Calling all_inventory to load vars for managed-node1 9396 1727204046.02406: Calling groups_inventory to load vars for managed-node1 9396 1727204046.02412: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204046.02422: Calling all_plugins_play to load vars for managed-node1 9396 1727204046.02426: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204046.02429: Calling groups_plugins_play to load vars for managed-node1 9396 1727204046.03739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204046.06247: done with get_vars() 9396 1727204046.06281: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:67 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.061) 0:00:22.034 ***** 9396 1727204046.06405: entering _queue_task() for managed-node1/include_tasks 9396 1727204046.06763: worker is 1 (out of 1 available) 9396 1727204046.06776: exiting _queue_task() for managed-node1/include_tasks 9396 1727204046.06792: done queuing things up, now waiting for results queue to drain 9396 1727204046.06794: waiting for pending results... 
9396 1727204046.07217: running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' 9396 1727204046.07249: in run() - task 12b410aa-8751-36c5-1f9e-000000000070 9396 1727204046.07271: variable 'ansible_search_path' from source: unknown 9396 1727204046.07340: variable 'controller_profile' from source: play vars 9396 1727204046.07572: variable 'controller_profile' from source: play vars 9396 1727204046.07598: variable 'port1_profile' from source: play vars 9396 1727204046.07696: variable 'port1_profile' from source: play vars 9396 1727204046.07715: variable 'port2_profile' from source: play vars 9396 1727204046.07810: variable 'port2_profile' from source: play vars 9396 1727204046.07834: variable 'omit' from source: magic vars 9396 1727204046.08075: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.08080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.08083: variable 'omit' from source: magic vars 9396 1727204046.08410: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.08431: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.08473: variable 'item' from source: unknown 9396 1727204046.08566: variable 'item' from source: unknown 9396 1727204046.08899: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.08903: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.08905: variable 'omit' from source: magic vars 9396 1727204046.09017: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.09022: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.09045: variable 'item' from source: unknown 9396 1727204046.09099: variable 'item' from source: unknown 9396 1727204046.09180: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.09190: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.09201: variable 'omit' from source: magic vars 9396 1727204046.09338: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.09344: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.09365: variable 'item' from source: unknown 9396 1727204046.09421: variable 'item' from source: unknown 9396 1727204046.09491: dumping result to json 9396 1727204046.09495: done dumping result, returning 9396 1727204046.09498: done running TaskExecutor() for managed-node1/TASK: Include the task 'assert_profile_present.yml' [12b410aa-8751-36c5-1f9e-000000000070] 9396 1727204046.09502: sending task result for task 12b410aa-8751-36c5-1f9e-000000000070 9396 1727204046.09543: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000070 9396 1727204046.09546: WORKER PROCESS EXITING 9396 1727204046.09580: no more pending results, returning what we have 9396 1727204046.09585: in VariableManager get_vars() 9396 1727204046.09635: Calling all_inventory to load vars for managed-node1 9396 1727204046.09638: Calling groups_inventory to load vars for managed-node1 9396 1727204046.09641: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204046.09654: Calling all_plugins_play to load vars for managed-node1 9396 1727204046.09658: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204046.09661: Calling groups_plugins_play to load vars for managed-node1 9396 1727204046.11002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204046.12559: done with get_vars() 9396 1727204046.12578: variable 'ansible_search_path' from source: unknown 9396 1727204046.12592: variable 'ansible_search_path' from source: unknown 9396 1727204046.12600: variable 'ansible_search_path' from source: unknown 9396 1727204046.12605: we have included files to process 9396 
1727204046.12606: generating all_blocks data 9396 1727204046.12609: done generating all_blocks data 9396 1727204046.12614: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.12615: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.12616: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.12775: in VariableManager get_vars() 9396 1727204046.12797: done with get_vars() 9396 1727204046.13015: done processing included file 9396 1727204046.13017: iterating over new_blocks loaded from include file 9396 1727204046.13018: in VariableManager get_vars() 9396 1727204046.13032: done with get_vars() 9396 1727204046.13034: filtering new block on tags 9396 1727204046.13051: done filtering new block on tags 9396 1727204046.13053: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0) 9396 1727204046.13058: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.13059: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.13061: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.13143: in VariableManager get_vars() 9396 1727204046.13162: done with get_vars() 9396 1727204046.13355: done processing included file 9396 1727204046.13357: iterating over new_blocks loaded 
from include file 9396 1727204046.13358: in VariableManager get_vars() 9396 1727204046.13373: done with get_vars() 9396 1727204046.13375: filtering new block on tags 9396 1727204046.13392: done filtering new block on tags 9396 1727204046.13395: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.0) 9396 1727204046.13397: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.13398: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.13400: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 9396 1727204046.13536: in VariableManager get_vars() 9396 1727204046.13554: done with get_vars() 9396 1727204046.13745: done processing included file 9396 1727204046.13747: iterating over new_blocks loaded from include file 9396 1727204046.13748: in VariableManager get_vars() 9396 1727204046.13761: done with get_vars() 9396 1727204046.13762: filtering new block on tags 9396 1727204046.13777: done filtering new block on tags 9396 1727204046.13778: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed-node1 => (item=bond0.1) 9396 1727204046.13781: extending task lists for all hosts with included blocks 9396 1727204046.15952: done extending task lists 9396 1727204046.15959: done processing included files 9396 1727204046.15960: results queue empty 9396 1727204046.15961: checking for any_errors_fatal 9396 1727204046.15963: done checking for any_errors_fatal 9396 
1727204046.15964: checking for max_fail_percentage 9396 1727204046.15964: done checking for max_fail_percentage 9396 1727204046.15965: checking to see if all hosts have failed and the running result is not ok 9396 1727204046.15966: done checking to see if all hosts have failed 9396 1727204046.15966: getting the remaining hosts for this loop 9396 1727204046.15967: done getting the remaining hosts for this loop 9396 1727204046.15969: getting the next task for host managed-node1 9396 1727204046.15972: done getting next task for host managed-node1 9396 1727204046.15974: ^ task is: TASK: Include the task 'get_profile_stat.yml' 9396 1727204046.15977: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204046.15979: getting variables 9396 1727204046.15980: in VariableManager get_vars() 9396 1727204046.15993: Calling all_inventory to load vars for managed-node1 9396 1727204046.15994: Calling groups_inventory to load vars for managed-node1 9396 1727204046.15997: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204046.16002: Calling all_plugins_play to load vars for managed-node1 9396 1727204046.16004: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204046.16006: Calling groups_plugins_play to load vars for managed-node1 9396 1727204046.17051: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204046.18681: done with get_vars() 9396 1727204046.18705: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.123) 0:00:22.158 ***** 9396 1727204046.18767: entering _queue_task() for managed-node1/include_tasks 9396 1727204046.19047: worker is 1 (out of 1 available) 9396 1727204046.19061: exiting _queue_task() for managed-node1/include_tasks 9396 1727204046.19075: done queuing things up, now waiting for results queue to drain 9396 1727204046.19077: waiting for pending results... 
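The TASK banner above is queued from assert_profile_present.yml:3. The file itself is not shown in this trace, but given the task name and the per-item includes recorded earlier (bond0, bond0.0, bond0.1), the include presumably looks roughly like this — a hedged reconstruction, not the actual file contents:

```yaml
# Hypothetical sketch of the task at assert_profile_present.yml:3; the real
# file in the fedora.linux_system_roles collection may differ.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
```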
9396 1727204046.19264: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 9396 1727204046.19339: in run() - task 12b410aa-8751-36c5-1f9e-000000000260 9396 1727204046.19352: variable 'ansible_search_path' from source: unknown 9396 1727204046.19356: variable 'ansible_search_path' from source: unknown 9396 1727204046.19388: calling self._execute() 9396 1727204046.19467: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.19473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.19484: variable 'omit' from source: magic vars 9396 1727204046.19805: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.19817: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.19824: _execute() done 9396 1727204046.19827: dumping result to json 9396 1727204046.19833: done dumping result, returning 9396 1727204046.19839: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-36c5-1f9e-000000000260] 9396 1727204046.19847: sending task result for task 12b410aa-8751-36c5-1f9e-000000000260 9396 1727204046.19944: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000260 9396 1727204046.19947: WORKER PROCESS EXITING 9396 1727204046.19985: no more pending results, returning what we have 9396 1727204046.19992: in VariableManager get_vars() 9396 1727204046.20040: Calling all_inventory to load vars for managed-node1 9396 1727204046.20043: Calling groups_inventory to load vars for managed-node1 9396 1727204046.20046: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204046.20060: Calling all_plugins_play to load vars for managed-node1 9396 1727204046.20063: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204046.20067: Calling groups_plugins_play to load vars for managed-node1 9396 1727204046.21278: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204046.22843: done with get_vars() 9396 1727204046.22864: variable 'ansible_search_path' from source: unknown 9396 1727204046.22865: variable 'ansible_search_path' from source: unknown 9396 1727204046.22903: we have included files to process 9396 1727204046.22905: generating all_blocks data 9396 1727204046.22906: done generating all_blocks data 9396 1727204046.22909: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204046.22910: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204046.22912: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204046.23780: done processing included file 9396 1727204046.23781: iterating over new_blocks loaded from include file 9396 1727204046.23783: in VariableManager get_vars() 9396 1727204046.23801: done with get_vars() 9396 1727204046.23802: filtering new block on tags 9396 1727204046.23824: done filtering new block on tags 9396 1727204046.23826: in VariableManager get_vars() 9396 1727204046.23841: done with get_vars() 9396 1727204046.23842: filtering new block on tags 9396 1727204046.23858: done filtering new block on tags 9396 1727204046.23860: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 9396 1727204046.23864: extending task lists for all hosts with included blocks 9396 1727204046.24009: done extending task lists 9396 1727204046.24010: done processing included files 9396 1727204046.24011: results queue empty 9396 1727204046.24011: checking for any_errors_fatal 9396 
1727204046.24014: done checking for any_errors_fatal 9396 1727204046.24014: checking for max_fail_percentage 9396 1727204046.24015: done checking for max_fail_percentage 9396 1727204046.24016: checking to see if all hosts have failed and the running result is not ok 9396 1727204046.24016: done checking to see if all hosts have failed 9396 1727204046.24017: getting the remaining hosts for this loop 9396 1727204046.24018: done getting the remaining hosts for this loop 9396 1727204046.24020: getting the next task for host managed-node1 9396 1727204046.24023: done getting next task for host managed-node1 9396 1727204046.24025: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 9396 1727204046.24027: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204046.24029: getting variables 9396 1727204046.24029: in VariableManager get_vars() 9396 1727204046.24091: Calling all_inventory to load vars for managed-node1 9396 1727204046.24094: Calling groups_inventory to load vars for managed-node1 9396 1727204046.24097: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204046.24103: Calling all_plugins_play to load vars for managed-node1 9396 1727204046.24105: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204046.24109: Calling groups_plugins_play to load vars for managed-node1 9396 1727204046.25192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204046.26729: done with get_vars() 9396 1727204046.26750: done getting variables 9396 1727204046.26783: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.080) 0:00:22.239 ***** 9396 1727204046.26812: entering _queue_task() for managed-node1/set_fact 9396 1727204046.27088: worker is 1 (out of 1 available) 9396 1727204046.27105: exiting _queue_task() for managed-node1/set_fact 9396 1727204046.27120: done queuing things up, now waiting for results queue to drain 9396 1727204046.27122: waiting for pending results... 
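The set_fact task dispatched next reports its result later in the trace: lsr_net_profile_exists, lsr_net_profile_ansible_managed, and lsr_net_profile_fingerprint are all initialized to false, with changed=false. A task producing exactly that ansible_facts payload would be — reconstructed from the logged result, not copied from get_profile_stat.yml:

```yaml
# Reconstructed from the ok: [managed-node1] result shown in this trace.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```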
9396 1727204046.27302: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 9396 1727204046.27387: in run() - task 12b410aa-8751-36c5-1f9e-0000000003b3 9396 1727204046.27401: variable 'ansible_search_path' from source: unknown 9396 1727204046.27405: variable 'ansible_search_path' from source: unknown 9396 1727204046.27439: calling self._execute() 9396 1727204046.27574: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.27578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.27582: variable 'omit' from source: magic vars 9396 1727204046.27858: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.27869: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.27876: variable 'omit' from source: magic vars 9396 1727204046.27921: variable 'omit' from source: magic vars 9396 1727204046.27953: variable 'omit' from source: magic vars 9396 1727204046.27988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204046.28027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204046.28044: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204046.28061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204046.28073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204046.28103: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204046.28110: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.28114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 
1727204046.28194: Set connection var ansible_timeout to 10 9396 1727204046.28201: Set connection var ansible_shell_executable to /bin/sh 9396 1727204046.28212: Set connection var ansible_pipelining to False 9396 1727204046.28217: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204046.28225: Set connection var ansible_connection to ssh 9396 1727204046.28228: Set connection var ansible_shell_type to sh 9396 1727204046.28254: variable 'ansible_shell_executable' from source: unknown 9396 1727204046.28257: variable 'ansible_connection' from source: unknown 9396 1727204046.28261: variable 'ansible_module_compression' from source: unknown 9396 1727204046.28263: variable 'ansible_shell_type' from source: unknown 9396 1727204046.28268: variable 'ansible_shell_executable' from source: unknown 9396 1727204046.28272: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.28277: variable 'ansible_pipelining' from source: unknown 9396 1727204046.28279: variable 'ansible_timeout' from source: unknown 9396 1727204046.28285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.28405: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204046.28416: variable 'omit' from source: magic vars 9396 1727204046.28423: starting attempt loop 9396 1727204046.28426: running the handler 9396 1727204046.28438: handler run complete 9396 1727204046.28450: attempt loop complete, returning result 9396 1727204046.28453: _execute() done 9396 1727204046.28456: dumping result to json 9396 1727204046.28460: done dumping result, returning 9396 1727204046.28471: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and 
ansible_managed comment flag [12b410aa-8751-36c5-1f9e-0000000003b3] 9396 1727204046.28474: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b3 9396 1727204046.28558: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b3 9396 1727204046.28561: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 9396 1727204046.28631: no more pending results, returning what we have 9396 1727204046.28636: results queue empty 9396 1727204046.28637: checking for any_errors_fatal 9396 1727204046.28639: done checking for any_errors_fatal 9396 1727204046.28640: checking for max_fail_percentage 9396 1727204046.28641: done checking for max_fail_percentage 9396 1727204046.28642: checking to see if all hosts have failed and the running result is not ok 9396 1727204046.28644: done checking to see if all hosts have failed 9396 1727204046.28645: getting the remaining hosts for this loop 9396 1727204046.28646: done getting the remaining hosts for this loop 9396 1727204046.28650: getting the next task for host managed-node1 9396 1727204046.28657: done getting next task for host managed-node1 9396 1727204046.28659: ^ task is: TASK: Stat profile file 9396 1727204046.28663: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204046.28667: getting variables 9396 1727204046.28669: in VariableManager get_vars() 9396 1727204046.28709: Calling all_inventory to load vars for managed-node1 9396 1727204046.28712: Calling groups_inventory to load vars for managed-node1 9396 1727204046.28715: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204046.28726: Calling all_plugins_play to load vars for managed-node1 9396 1727204046.28729: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204046.28733: Calling groups_plugins_play to load vars for managed-node1 9396 1727204046.30066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204046.31634: done with get_vars() 9396 1727204046.31658: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.049) 0:00:22.288 ***** 9396 1727204046.31734: entering _queue_task() for managed-node1/stat 9396 1727204046.31992: worker is 1 (out of 1 available) 9396 1727204046.32006: exiting _queue_task() for managed-node1/stat 9396 1727204046.32022: done queuing things up, now waiting for results queue to drain 9396 1727204046.32024: waiting for pending results... 
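The "Stat profile file" task at get_profile_stat.yml:9 resolves the 'profile' variable from the include params and then dispatches the stat module over SSH (the AnsiballZ_stat.py transfer that follows). The path argument is not visible in this excerpt, so the sketch below uses a placeholder:

```yaml
# Sketch only: 'profile' comes from the include params seen in the trace;
# the actual stat path is not shown in this excerpt.
- name: Stat profile file
  stat:
    path: "{{ profile_path }}"  # placeholder, real path unknown here
  register: stat_profile_file
```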
9396 1727204046.32200: running TaskExecutor() for managed-node1/TASK: Stat profile file 9396 1727204046.32283: in run() - task 12b410aa-8751-36c5-1f9e-0000000003b4 9396 1727204046.32298: variable 'ansible_search_path' from source: unknown 9396 1727204046.32302: variable 'ansible_search_path' from source: unknown 9396 1727204046.32334: calling self._execute() 9396 1727204046.32412: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.32416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.32428: variable 'omit' from source: magic vars 9396 1727204046.32740: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.32751: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.32758: variable 'omit' from source: magic vars 9396 1727204046.32799: variable 'omit' from source: magic vars 9396 1727204046.32878: variable 'profile' from source: include params 9396 1727204046.32882: variable 'item' from source: include params 9396 1727204046.32942: variable 'item' from source: include params 9396 1727204046.32958: variable 'omit' from source: magic vars 9396 1727204046.32994: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204046.33034: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204046.33053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204046.33069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204046.33080: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204046.33113: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204046.33116: variable 
'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.33120: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.33201: Set connection var ansible_timeout to 10 9396 1727204046.33210: Set connection var ansible_shell_executable to /bin/sh 9396 1727204046.33218: Set connection var ansible_pipelining to False 9396 1727204046.33225: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204046.33234: Set connection var ansible_connection to ssh 9396 1727204046.33238: Set connection var ansible_shell_type to sh 9396 1727204046.33262: variable 'ansible_shell_executable' from source: unknown 9396 1727204046.33265: variable 'ansible_connection' from source: unknown 9396 1727204046.33268: variable 'ansible_module_compression' from source: unknown 9396 1727204046.33271: variable 'ansible_shell_type' from source: unknown 9396 1727204046.33275: variable 'ansible_shell_executable' from source: unknown 9396 1727204046.33279: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.33284: variable 'ansible_pipelining' from source: unknown 9396 1727204046.33287: variable 'ansible_timeout' from source: unknown 9396 1727204046.33295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.33461: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204046.33474: variable 'omit' from source: magic vars 9396 1727204046.33477: starting attempt loop 9396 1727204046.33480: running the handler 9396 1727204046.33495: _low_level_execute_command(): starting 9396 1727204046.33503: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204046.34052: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.34056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.34060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204046.34064: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.34115: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204046.34118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204046.34121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.34174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.35897: stdout chunk (state=3): >>>/root <<< 9396 1727204046.36004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.36056: stderr chunk (state=3): >>><<< 9396 1727204046.36060: stdout chunk (state=3): >>><<< 9396 1727204046.36081: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204046.36093: _low_level_execute_command(): starting 9396 1727204046.36104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203 `" && echo ansible-tmp-1727204046.3608031-11270-130060459501203="` echo /root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203 `" ) && sleep 0' 9396 1727204046.36560: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.36564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.36574: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.36576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.36630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204046.36634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.36675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.38665: stdout chunk (state=3): >>>ansible-tmp-1727204046.3608031-11270-130060459501203=/root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203 <<< 9396 1727204046.38784: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.38837: stderr chunk (state=3): >>><<< 9396 1727204046.38842: stdout chunk (state=3): >>><<< 9396 1727204046.38853: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204046.3608031-11270-130060459501203=/root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204046.38894: variable 'ansible_module_compression' from source: unknown 9396 1727204046.38946: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9396 1727204046.38977: variable 'ansible_facts' from source: unknown 9396 1727204046.39030: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/AnsiballZ_stat.py 9396 1727204046.39144: Sending initial data 9396 1727204046.39147: Sent initial data (152 bytes) 9396 1727204046.39606: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.39612: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.39614: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204046.39617: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204046.39619: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.39668: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204046.39674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.39716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.41345: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 9396 1727204046.41349: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204046.41383: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204046.41423: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmptdnay78a /root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/AnsiballZ_stat.py <<< 9396 1727204046.41427: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/AnsiballZ_stat.py" <<< 9396 1727204046.41460: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmptdnay78a" to remote "/root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/AnsiballZ_stat.py" <<< 9396 1727204046.42241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.42296: stderr chunk (state=3): >>><<< 9396 1727204046.42300: stdout chunk (state=3): >>><<< 9396 1727204046.42319: done transferring module to remote 9396 1727204046.42329: _low_level_execute_command(): starting 9396 1727204046.42334: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/ /root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/AnsiballZ_stat.py && sleep 0' 9396 1727204046.42769: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204046.42772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204046.42775: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.42779: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.42782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.42870: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.42930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.44881: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.44911: stderr chunk (state=3): >>><<< 9396 1727204046.44927: stdout chunk (state=3): >>><<< 9396 1727204046.44944: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204046.44999: _low_level_execute_command(): starting 9396 1727204046.45003: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/AnsiballZ_stat.py && sleep 0' 9396 1727204046.45662: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204046.45757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.45773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.45837: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204046.45856: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204046.45880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.45983: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 9396 1727204046.64001: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 9396 1727204046.65503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.65519: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. <<< 9396 1727204046.65536: stdout chunk (state=3): >>><<< 9396 1727204046.65696: stderr chunk (state=3): >>><<< 9396 1727204046.65700: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204046.65704: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204046.65707: _low_level_execute_command(): starting 9396 1727204046.65709: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204046.3608031-11270-130060459501203/ > /dev/null 2>&1 && sleep 0' 9396 1727204046.66292: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204046.66312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204046.66330: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.66352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204046.66381: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.66488: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 9396 1727204046.66514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204046.66534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.66621: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.68637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.68640: stdout chunk (state=3): >>><<< 9396 1727204046.68643: stderr chunk (state=3): >>><<< 9396 1727204046.68794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204046.68798: handler run complete 9396 1727204046.68800: attempt loop complete, returning result 9396 1727204046.68803: _execute() done 9396 1727204046.68805: dumping result to json 9396 1727204046.68809: done dumping result, returning 9396 1727204046.68812: done running TaskExecutor() for managed-node1/TASK: Stat profile file [12b410aa-8751-36c5-1f9e-0000000003b4] 9396 1727204046.68814: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b4 9396 1727204046.68901: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b4 ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 9396 1727204046.68978: no more pending results, returning what we have 9396 1727204046.68982: results queue empty 9396 1727204046.68984: checking for any_errors_fatal 9396 1727204046.68993: done checking for any_errors_fatal 9396 1727204046.68994: checking for max_fail_percentage 9396 1727204046.68996: done checking for max_fail_percentage 9396 1727204046.68997: checking to see if all hosts have failed and the running result is not ok 9396 1727204046.68998: done checking to see if all hosts have failed 9396 1727204046.68999: getting the remaining hosts for this loop 9396 1727204046.69196: done getting the remaining hosts for this loop 9396 1727204046.69202: getting the next task for host managed-node1 9396 1727204046.69212: done getting next task for host managed-node1 9396 1727204046.69216: ^ task is: TASK: Set NM profile exist flag based on the profile files 9396 1727204046.69221: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, 
tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204046.69225: getting variables 9396 1727204046.69227: in VariableManager get_vars() 9396 1727204046.69275: Calling all_inventory to load vars for managed-node1 9396 1727204046.69279: Calling groups_inventory to load vars for managed-node1 9396 1727204046.69282: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204046.69303: WORKER PROCESS EXITING 9396 1727204046.69320: Calling all_plugins_play to load vars for managed-node1 9396 1727204046.69325: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204046.69329: Calling groups_plugins_play to load vars for managed-node1 9396 1727204046.71849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204046.74905: done with get_vars() 9396 1727204046.74942: done getting variables 9396 1727204046.75015: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.433) 0:00:22.721 ***** 9396 1727204046.75046: entering _queue_task() for managed-node1/set_fact 9396 1727204046.75376: worker is 1 (out of 1 available) 9396 1727204046.75393: exiting _queue_task() for managed-node1/set_fact 9396 1727204046.75407: done queuing things up, now waiting for results queue to drain 9396 1727204046.75409: waiting for pending results... 9396 1727204046.75694: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 9396 1727204046.75826: in run() - task 12b410aa-8751-36c5-1f9e-0000000003b5 9396 1727204046.75848: variable 'ansible_search_path' from source: unknown 9396 1727204046.75857: variable 'ansible_search_path' from source: unknown 9396 1727204046.75904: calling self._execute() 9396 1727204046.76004: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.76023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.76040: variable 'omit' from source: magic vars 9396 1727204046.76480: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.76502: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.76664: variable 'profile_stat' from source: set_fact 9396 1727204046.76691: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204046.76700: when evaluation is False, skipping this task 9396 1727204046.76709: _execute() done 9396 1727204046.76719: dumping result to json 9396 1727204046.76728: done dumping result, returning 9396 1727204046.76738: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-36c5-1f9e-0000000003b5] 9396 1727204046.76750: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b5 skipping: 
[managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204046.76942: no more pending results, returning what we have 9396 1727204046.76947: results queue empty 9396 1727204046.76948: checking for any_errors_fatal 9396 1727204046.76960: done checking for any_errors_fatal 9396 1727204046.76961: checking for max_fail_percentage 9396 1727204046.76963: done checking for max_fail_percentage 9396 1727204046.76965: checking to see if all hosts have failed and the running result is not ok 9396 1727204046.76966: done checking to see if all hosts have failed 9396 1727204046.76967: getting the remaining hosts for this loop 9396 1727204046.76968: done getting the remaining hosts for this loop 9396 1727204046.76973: getting the next task for host managed-node1 9396 1727204046.76980: done getting next task for host managed-node1 9396 1727204046.76983: ^ task is: TASK: Get NM profile info 9396 1727204046.76991: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204046.76997: getting variables 9396 1727204046.76999: in VariableManager get_vars() 9396 1727204046.77045: Calling all_inventory to load vars for managed-node1 9396 1727204046.77048: Calling groups_inventory to load vars for managed-node1 9396 1727204046.77052: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204046.77068: Calling all_plugins_play to load vars for managed-node1 9396 1727204046.77073: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204046.77077: Calling groups_plugins_play to load vars for managed-node1 9396 1727204046.77706: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b5 9396 1727204046.77710: WORKER PROCESS EXITING 9396 1727204046.79518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204046.82346: done with get_vars() 9396 1727204046.82382: done getting variables 9396 1727204046.82454: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:06 -0400 (0:00:00.074) 0:00:22.795 ***** 9396 1727204046.82493: entering _queue_task() for managed-node1/shell 9396 1727204046.82822: worker is 1 (out of 1 available) 9396 1727204046.82836: exiting _queue_task() for managed-node1/shell 9396 1727204046.82850: done queuing things up, now waiting for results queue to drain 9396 1727204046.82853: waiting for pending results... 
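The skip recorded above follows directly from the earlier `stat` result: the file `/etc/sysconfig/network-scripts/ifcfg-bond0` did not exist, so `profile_stat.stat.exists` evaluated False and the `set_fact` task was skipped. A minimal sketch of that decision in Python (the `stat_module` helper here is a hypothetical stand-in, not Ansible's actual module code, and reports only the fields visible in the log):

```python
import os

def stat_module(path):
    # Hypothetical, trimmed stand-in for Ansible's stat module:
    # report only the "changed"/"stat.exists" fields seen in the log.
    try:
        os.stat(path)
        return {"changed": False, "stat": {"exists": True}}
    except (FileNotFoundError, NotADirectoryError):
        return {"changed": False, "stat": {"exists": False}}

# In the run logged above this profile file was absent, so a
# `when: profile_stat.stat.exists` conditional evaluates False
# and the dependent task is skipped.
profile_stat = stat_module("/etc/sysconfig/network-scripts/ifcfg-bond0")
run_set_fact = profile_stat["stat"]["exists"]
```

This mirrors the log's sequence: the module returns `{"changed": false, "stat": {"exists": false}}`, and the conditional evaluation then produces `skip_reason: "Conditional result was False"`.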
9396 1727204046.83152: running TaskExecutor() for managed-node1/TASK: Get NM profile info 9396 1727204046.83299: in run() - task 12b410aa-8751-36c5-1f9e-0000000003b6 9396 1727204046.83327: variable 'ansible_search_path' from source: unknown 9396 1727204046.83336: variable 'ansible_search_path' from source: unknown 9396 1727204046.83379: calling self._execute() 9396 1727204046.83482: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.83499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.83517: variable 'omit' from source: magic vars 9396 1727204046.83959: variable 'ansible_distribution_major_version' from source: facts 9396 1727204046.83982: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204046.83996: variable 'omit' from source: magic vars 9396 1727204046.84062: variable 'omit' from source: magic vars 9396 1727204046.84192: variable 'profile' from source: include params 9396 1727204046.84204: variable 'item' from source: include params 9396 1727204046.84297: variable 'item' from source: include params 9396 1727204046.84315: variable 'omit' from source: magic vars 9396 1727204046.84494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204046.84498: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204046.84500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204046.84503: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204046.84505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204046.84514: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204046.84523: variable 
'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.84531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.84662: Set connection var ansible_timeout to 10 9396 1727204046.84676: Set connection var ansible_shell_executable to /bin/sh 9396 1727204046.84693: Set connection var ansible_pipelining to False 9396 1727204046.84705: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204046.84716: Set connection var ansible_connection to ssh 9396 1727204046.84728: Set connection var ansible_shell_type to sh 9396 1727204046.84763: variable 'ansible_shell_executable' from source: unknown 9396 1727204046.84773: variable 'ansible_connection' from source: unknown 9396 1727204046.84781: variable 'ansible_module_compression' from source: unknown 9396 1727204046.84788: variable 'ansible_shell_type' from source: unknown 9396 1727204046.84797: variable 'ansible_shell_executable' from source: unknown 9396 1727204046.84805: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204046.84813: variable 'ansible_pipelining' from source: unknown 9396 1727204046.84821: variable 'ansible_timeout' from source: unknown 9396 1727204046.84829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204046.84998: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204046.85056: variable 'omit' from source: magic vars 9396 1727204046.85059: starting attempt loop 9396 1727204046.85062: running the handler 9396 1727204046.85065: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204046.85077: _low_level_execute_command(): starting 9396 1727204046.85094: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204046.85908: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.85996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204046.86032: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.86094: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.87834: stdout chunk (state=3): >>>/root <<< 9396 1727204046.88034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.88038: stdout chunk (state=3): >>><<< 9396 1727204046.88041: stderr chunk (state=3): >>><<< 9396 1727204046.88154: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204046.88158: _low_level_execute_command(): starting 9396 1727204046.88161: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095 `" && echo ansible-tmp-1727204046.880673-11290-32476512911095="` echo /root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095 `" ) && sleep 0' 9396 1727204046.88742: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204046.88758: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204046.88772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.88806: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204046.88910: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.88915: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.88962: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204046.88981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204046.89008: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.89086: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.91188: stdout chunk (state=3): >>>ansible-tmp-1727204046.880673-11290-32476512911095=/root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095 <<< 9396 1727204046.91394: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.91398: stdout chunk (state=3): >>><<< 9396 1727204046.91400: stderr chunk (state=3): >>><<< 9396 1727204046.91421: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204046.880673-11290-32476512911095=/root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204046.91594: variable 'ansible_module_compression' from source: unknown 9396 1727204046.91597: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204046.91600: variable 'ansible_facts' from source: unknown 9396 1727204046.91669: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/AnsiballZ_command.py 9396 1727204046.91812: Sending initial data 9396 1727204046.91944: Sent initial data (153 bytes) 9396 1727204046.92599: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 9396 1727204046.92619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204046.92636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.92722: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.94437: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204046.94499: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204046.94565: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp1y82uqoi /root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/AnsiballZ_command.py <<< 9396 1727204046.94585: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/AnsiballZ_command.py" <<< 9396 1727204046.94615: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp1y82uqoi" to remote "/root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/AnsiballZ_command.py" <<< 9396 1727204046.95907: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.95953: stderr chunk (state=3): >>><<< 9396 1727204046.95960: stdout chunk (state=3): >>><<< 9396 1727204046.96000: done transferring module to remote 9396 1727204046.96016: _low_level_execute_command(): starting 9396 1727204046.96022: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/ /root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/AnsiballZ_command.py && sleep 0' 9396 1727204046.96694: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204046.96894: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204046.96898: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204046.96901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.96904: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204046.96908: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.96916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204046.96919: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204046.96921: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.96993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204046.98998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204046.99001: stdout chunk (state=3): >>><<< 9396 1727204046.99004: stderr chunk (state=3): >>><<< 9396 1727204046.99007: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204046.99014: _low_level_execute_command(): starting 9396 1727204046.99021: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/AnsiballZ_command.py && sleep 0' 9396 1727204046.99705: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204046.99725: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204046.99778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204046.99840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204046.99876: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 9396 1727204046.99900: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204046.99982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204047.20348: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:07.177386", "end": "2024-09-24 14:54:07.202412", "delta": "0:00:00.025026", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204047.22297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204047.22301: stderr chunk (state=3): >>><<< 9396 1727204047.22304: stdout chunk (state=3): >>><<< 9396 1727204047.22306: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0 /etc/NetworkManager/system-connections/bond0.nmconnection \nbond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-24 14:54:07.177386", "end": "2024-09-24 14:54:07.202412", "delta": "0:00:00.025026", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204047.22310: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204047.22312: _low_level_execute_command(): starting 9396 1727204047.22315: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204046.880673-11290-32476512911095/ > /dev/null 2>&1 && sleep 0' 9396 1727204047.22961: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204047.22970: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204047.22982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204047.23003: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204047.23020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204047.23027: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204047.23038: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204047.23061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204047.23070: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204047.23078: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204047.23087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204047.23172: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204047.23202: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204047.23218: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204047.23239: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204047.23314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204047.25495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204047.25499: stdout chunk (state=3): >>><<< 9396 1727204047.25503: stderr chunk (state=3): >>><<< 9396 1727204047.25506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204047.25509: handler run complete 9396 1727204047.25512: Evaluated conditional (False): False 9396 1727204047.25515: attempt loop complete, returning result 9396 1727204047.25546: _execute() done 9396 1727204047.25549: dumping result to json 9396 1727204047.25551: done dumping result, returning 9396 1727204047.25553: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [12b410aa-8751-36c5-1f9e-0000000003b6] 9396 1727204047.25554: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b6 ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.025026", "end": "2024-09-24 14:54:07.202412", "rc": 0, "start": "2024-09-24 14:54:07.177386" } STDOUT: bond0 /etc/NetworkManager/system-connections/bond0.nmconnection bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 9396 1727204047.25734: no more pending results, returning what we have 9396 1727204047.25740: results queue empty 9396 1727204047.25741: checking for any_errors_fatal 9396 1727204047.25750: done checking for any_errors_fatal 9396 1727204047.25751: checking for max_fail_percentage 9396 1727204047.25753: done checking for max_fail_percentage 9396 1727204047.25754: checking to see 
if all hosts have failed and the running result is not ok 9396 1727204047.25756: done checking to see if all hosts have failed 9396 1727204047.25757: getting the remaining hosts for this loop 9396 1727204047.25759: done getting the remaining hosts for this loop 9396 1727204047.25764: getting the next task for host managed-node1 9396 1727204047.25773: done getting next task for host managed-node1 9396 1727204047.25777: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 9396 1727204047.25782: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204047.25786: getting variables 9396 1727204047.25997: in VariableManager get_vars() 9396 1727204047.26052: Calling all_inventory to load vars for managed-node1 9396 1727204047.26056: Calling groups_inventory to load vars for managed-node1 9396 1727204047.26059: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.26073: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.26077: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.26082: Calling groups_plugins_play to load vars for managed-node1 9396 1727204047.26606: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b6 9396 1727204047.26610: WORKER PROCESS EXITING 9396 1727204047.28498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204047.31331: done with get_vars() 9396 1727204047.31365: done getting variables 9396 1727204047.31439: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.489) 0:00:23.285 ***** 9396 1727204047.31475: entering _queue_task() for managed-node1/set_fact 9396 1727204047.31826: worker is 1 (out of 1 available) 9396 1727204047.31839: exiting _queue_task() for managed-node1/set_fact 9396 1727204047.31854: done queuing things up, now waiting for results queue to drain 9396 1727204047.31856: waiting for pending results... 
9396 1727204047.32221: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 9396 1727204047.32326: in run() - task 12b410aa-8751-36c5-1f9e-0000000003b7 9396 1727204047.32351: variable 'ansible_search_path' from source: unknown 9396 1727204047.32356: variable 'ansible_search_path' from source: unknown 9396 1727204047.32396: calling self._execute() 9396 1727204047.32504: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.32515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.32646: variable 'omit' from source: magic vars 9396 1727204047.33077: variable 'ansible_distribution_major_version' from source: facts 9396 1727204047.33093: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204047.33307: variable 'nm_profile_exists' from source: set_fact 9396 1727204047.33334: Evaluated conditional (nm_profile_exists.rc == 0): True 9396 1727204047.33341: variable 'omit' from source: magic vars 9396 1727204047.33510: variable 'omit' from source: magic vars 9396 1727204047.33721: variable 'omit' from source: magic vars 9396 1727204047.33761: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204047.33803: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204047.33828: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204047.33855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204047.33866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204047.33908: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204047.33915: 
variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.33920: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.34047: Set connection var ansible_timeout to 10 9396 1727204047.34071: Set connection var ansible_shell_executable to /bin/sh 9396 1727204047.34074: Set connection var ansible_pipelining to False 9396 1727204047.34076: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204047.34079: Set connection var ansible_connection to ssh 9396 1727204047.34081: Set connection var ansible_shell_type to sh 9396 1727204047.34181: variable 'ansible_shell_executable' from source: unknown 9396 1727204047.34184: variable 'ansible_connection' from source: unknown 9396 1727204047.34187: variable 'ansible_module_compression' from source: unknown 9396 1727204047.34191: variable 'ansible_shell_type' from source: unknown 9396 1727204047.34193: variable 'ansible_shell_executable' from source: unknown 9396 1727204047.34196: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.34198: variable 'ansible_pipelining' from source: unknown 9396 1727204047.34200: variable 'ansible_timeout' from source: unknown 9396 1727204047.34203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.34313: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204047.34327: variable 'omit' from source: magic vars 9396 1727204047.34341: starting attempt loop 9396 1727204047.34344: running the handler 9396 1727204047.34360: handler run complete 9396 1727204047.34396: attempt loop complete, returning result 9396 1727204047.34400: _execute() done 9396 1727204047.34402: dumping result to json 
9396 1727204047.34405: done dumping result, returning 9396 1727204047.34407: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-36c5-1f9e-0000000003b7] 9396 1727204047.34410: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b7 9396 1727204047.34494: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b7 9396 1727204047.34497: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 9396 1727204047.34560: no more pending results, returning what we have 9396 1727204047.34564: results queue empty 9396 1727204047.34565: checking for any_errors_fatal 9396 1727204047.34576: done checking for any_errors_fatal 9396 1727204047.34577: checking for max_fail_percentage 9396 1727204047.34579: done checking for max_fail_percentage 9396 1727204047.34580: checking to see if all hosts have failed and the running result is not ok 9396 1727204047.34581: done checking to see if all hosts have failed 9396 1727204047.34582: getting the remaining hosts for this loop 9396 1727204047.34584: done getting the remaining hosts for this loop 9396 1727204047.34588: getting the next task for host managed-node1 9396 1727204047.34600: done getting next task for host managed-node1 9396 1727204047.34603: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 9396 1727204047.34608: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204047.34612: getting variables 9396 1727204047.34614: in VariableManager get_vars() 9396 1727204047.34651: Calling all_inventory to load vars for managed-node1 9396 1727204047.34653: Calling groups_inventory to load vars for managed-node1 9396 1727204047.34656: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.34668: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.34671: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.34675: Calling groups_plugins_play to load vars for managed-node1 9396 1727204047.36939: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204047.40025: done with get_vars() 9396 1727204047.40061: done getting variables 9396 1727204047.40143: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204047.40329: variable 'profile' from source: include params 9396 1727204047.40339: variable 'item' from source: include params 9396 1727204047.40420: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.089) 0:00:23.375 ***** 9396 1727204047.40469: entering _queue_task() for managed-node1/command 9396 1727204047.40925: worker is 1 (out of 1 available) 9396 1727204047.40938: exiting _queue_task() for managed-node1/command 9396 1727204047.40951: done queuing things up, now waiting for results queue to drain 9396 1727204047.40953: waiting for pending results... 9396 1727204047.41307: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 9396 1727204047.41344: in run() - task 12b410aa-8751-36c5-1f9e-0000000003b9 9396 1727204047.41397: variable 'ansible_search_path' from source: unknown 9396 1727204047.41402: variable 'ansible_search_path' from source: unknown 9396 1727204047.41406: calling self._execute() 9396 1727204047.41508: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.41516: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.41532: variable 'omit' from source: magic vars 9396 1727204047.41976: variable 'ansible_distribution_major_version' from source: facts 9396 1727204047.42003: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204047.42163: variable 'profile_stat' from source: set_fact 9396 1727204047.42179: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204047.42182: when evaluation is False, skipping this task 9396 1727204047.42185: _execute() done 9396 1727204047.42192: dumping result to json 9396 1727204047.42206: done dumping result, returning 9396 1727204047.42216: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [12b410aa-8751-36c5-1f9e-0000000003b9] 9396 1727204047.42223: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003b9 9396 1727204047.42330: done sending 
task result for task 12b410aa-8751-36c5-1f9e-0000000003b9 9396 1727204047.42334: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204047.42396: no more pending results, returning what we have 9396 1727204047.42402: results queue empty 9396 1727204047.42403: checking for any_errors_fatal 9396 1727204047.42415: done checking for any_errors_fatal 9396 1727204047.42416: checking for max_fail_percentage 9396 1727204047.42418: done checking for max_fail_percentage 9396 1727204047.42419: checking to see if all hosts have failed and the running result is not ok 9396 1727204047.42420: done checking to see if all hosts have failed 9396 1727204047.42421: getting the remaining hosts for this loop 9396 1727204047.42423: done getting the remaining hosts for this loop 9396 1727204047.42428: getting the next task for host managed-node1 9396 1727204047.42436: done getting next task for host managed-node1 9396 1727204047.42439: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 9396 1727204047.42444: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 9396 1727204047.42451: getting variables 9396 1727204047.42453: in VariableManager get_vars() 9396 1727204047.42701: Calling all_inventory to load vars for managed-node1 9396 1727204047.42705: Calling groups_inventory to load vars for managed-node1 9396 1727204047.42711: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.42722: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.42726: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.42730: Calling groups_plugins_play to load vars for managed-node1 9396 1727204047.45124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204047.48146: done with get_vars() 9396 1727204047.48200: done getting variables 9396 1727204047.48284: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204047.48427: variable 'profile' from source: include params 9396 1727204047.48431: variable 'item' from source: include params 9396 1727204047.48511: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.080) 0:00:23.456 ***** 9396 1727204047.48548: entering _queue_task() for managed-node1/set_fact 9396 1727204047.48971: worker is 1 (out of 1 available) 9396 1727204047.48984: exiting _queue_task() for managed-node1/set_fact 9396 1727204047.49203: done queuing things up, now waiting for results queue to drain 9396 1727204047.49206: waiting for pending results... 
9396 1727204047.49399: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 9396 1727204047.49494: in run() - task 12b410aa-8751-36c5-1f9e-0000000003ba 9396 1727204047.49501: variable 'ansible_search_path' from source: unknown 9396 1727204047.49504: variable 'ansible_search_path' from source: unknown 9396 1727204047.49599: calling self._execute() 9396 1727204047.49675: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.49684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.49705: variable 'omit' from source: magic vars 9396 1727204047.50187: variable 'ansible_distribution_major_version' from source: facts 9396 1727204047.50207: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204047.50384: variable 'profile_stat' from source: set_fact 9396 1727204047.50471: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204047.50475: when evaluation is False, skipping this task 9396 1727204047.50478: _execute() done 9396 1727204047.50480: dumping result to json 9396 1727204047.50483: done dumping result, returning 9396 1727204047.50485: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [12b410aa-8751-36c5-1f9e-0000000003ba] 9396 1727204047.50487: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003ba 9396 1727204047.50559: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003ba 9396 1727204047.50562: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204047.50743: no more pending results, returning what we have 9396 1727204047.50748: results queue empty 9396 1727204047.50750: checking for any_errors_fatal 9396 1727204047.50757: done checking for any_errors_fatal 9396 1727204047.50758: checking for 
max_fail_percentage 9396 1727204047.50760: done checking for max_fail_percentage 9396 1727204047.50761: checking to see if all hosts have failed and the running result is not ok 9396 1727204047.50762: done checking to see if all hosts have failed 9396 1727204047.50763: getting the remaining hosts for this loop 9396 1727204047.50765: done getting the remaining hosts for this loop 9396 1727204047.50769: getting the next task for host managed-node1 9396 1727204047.50776: done getting next task for host managed-node1 9396 1727204047.50780: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 9396 1727204047.50785: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204047.50791: getting variables 9396 1727204047.50793: in VariableManager get_vars() 9396 1727204047.50842: Calling all_inventory to load vars for managed-node1 9396 1727204047.50846: Calling groups_inventory to load vars for managed-node1 9396 1727204047.50964: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.50977: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.50981: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.50986: Calling groups_plugins_play to load vars for managed-node1 9396 1727204047.53388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204047.56486: done with get_vars() 9396 1727204047.56531: done getting variables 9396 1727204047.56609: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204047.56750: variable 'profile' from source: include params 9396 1727204047.56754: variable 'item' from source: include params 9396 1727204047.56832: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.083) 0:00:23.539 ***** 9396 1727204047.56874: entering _queue_task() for managed-node1/command 9396 1727204047.57245: worker is 1 (out of 1 available) 9396 1727204047.57259: exiting _queue_task() for managed-node1/command 9396 1727204047.57272: done queuing things up, now waiting for results queue to drain 9396 1727204047.57274: waiting for pending results... 
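
The skipped tasks above all report `"false_condition": "profile_stat.stat.exists"`, i.e. each verification task in `get_profile_stat.yml` is guarded by a `when:` clause on a previously registered stat result, and the ifcfg file does not exist on `managed-node1`. A minimal illustrative sketch of that guard pattern follows; the file path and the fact being set are assumptions for illustration, not the actual task bodies from the collection:

```yaml
# Illustrative sketch only - the real tasks live in
# tests/network/playbooks/tasks/get_profile_stat.yml.
# Path and set_fact payload below are assumed, not taken from the source.
- name: Stat the profile file (assumed path)
  stat:
    path: "/etc/sysconfig/network-scripts/ifcfg-{{ profile }}"
  register: profile_stat

- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: true   # placeholder payload
  when: profile_stat.stat.exists            # False here => task is skipped
```

When `profile_stat.stat.exists` evaluates to False, Ansible emits exactly the `skipping: [managed-node1]` result with `"skip_reason": "Conditional result was False"` seen throughout this log.
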
9396 1727204047.57682: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 9396 1727204047.57727: in run() - task 12b410aa-8751-36c5-1f9e-0000000003bb 9396 1727204047.57742: variable 'ansible_search_path' from source: unknown 9396 1727204047.57747: variable 'ansible_search_path' from source: unknown 9396 1727204047.57803: calling self._execute() 9396 1727204047.57930: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.57939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.57943: variable 'omit' from source: magic vars 9396 1727204047.58385: variable 'ansible_distribution_major_version' from source: facts 9396 1727204047.58444: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204047.58564: variable 'profile_stat' from source: set_fact 9396 1727204047.58580: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204047.58584: when evaluation is False, skipping this task 9396 1727204047.58596: _execute() done 9396 1727204047.58602: dumping result to json 9396 1727204047.58607: done dumping result, returning 9396 1727204047.58681: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0 [12b410aa-8751-36c5-1f9e-0000000003bb] 9396 1727204047.58684: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003bb 9396 1727204047.58750: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003bb 9396 1727204047.58753: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204047.58927: no more pending results, returning what we have 9396 1727204047.58932: results queue empty 9396 1727204047.58933: checking for any_errors_fatal 9396 1727204047.58940: done checking for any_errors_fatal 9396 1727204047.58941: checking for max_fail_percentage 9396 
1727204047.58943: done checking for max_fail_percentage 9396 1727204047.58944: checking to see if all hosts have failed and the running result is not ok 9396 1727204047.58945: done checking to see if all hosts have failed 9396 1727204047.58946: getting the remaining hosts for this loop 9396 1727204047.58948: done getting the remaining hosts for this loop 9396 1727204047.58952: getting the next task for host managed-node1 9396 1727204047.58958: done getting next task for host managed-node1 9396 1727204047.58962: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 9396 1727204047.58967: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204047.58971: getting variables 9396 1727204047.58973: in VariableManager get_vars() 9396 1727204047.59014: Calling all_inventory to load vars for managed-node1 9396 1727204047.59018: Calling groups_inventory to load vars for managed-node1 9396 1727204047.59021: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.59033: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.59037: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.59042: Calling groups_plugins_play to load vars for managed-node1 9396 1727204047.61334: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204047.64352: done with get_vars() 9396 1727204047.64385: done getting variables 9396 1727204047.64462: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204047.64602: variable 'profile' from source: include params 9396 1727204047.64609: variable 'item' from source: include params 9396 1727204047.64694: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.078) 0:00:23.618 ***** 9396 1727204047.64735: entering _queue_task() for managed-node1/set_fact 9396 1727204047.65100: worker is 1 (out of 1 available) 9396 1727204047.65119: exiting _queue_task() for managed-node1/set_fact 9396 1727204047.65134: done queuing things up, now waiting for results queue to drain 9396 1727204047.65136: waiting for pending results... 
9396 1727204047.65508: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 9396 1727204047.65529: in run() - task 12b410aa-8751-36c5-1f9e-0000000003bc 9396 1727204047.65554: variable 'ansible_search_path' from source: unknown 9396 1727204047.65562: variable 'ansible_search_path' from source: unknown 9396 1727204047.65607: calling self._execute() 9396 1727204047.65712: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.65727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.65746: variable 'omit' from source: magic vars 9396 1727204047.66134: variable 'ansible_distribution_major_version' from source: facts 9396 1727204047.66295: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204047.66316: variable 'profile_stat' from source: set_fact 9396 1727204047.66337: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204047.66347: when evaluation is False, skipping this task 9396 1727204047.66355: _execute() done 9396 1727204047.66364: dumping result to json 9396 1727204047.66372: done dumping result, returning 9396 1727204047.66383: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [12b410aa-8751-36c5-1f9e-0000000003bc] 9396 1727204047.66397: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003bc skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204047.66550: no more pending results, returning what we have 9396 1727204047.66555: results queue empty 9396 1727204047.66556: checking for any_errors_fatal 9396 1727204047.66560: done checking for any_errors_fatal 9396 1727204047.66561: checking for max_fail_percentage 9396 1727204047.66563: done checking for max_fail_percentage 9396 1727204047.66564: checking to see if all hosts have failed and the running 
result is not ok 9396 1727204047.66565: done checking to see if all hosts have failed 9396 1727204047.66566: getting the remaining hosts for this loop 9396 1727204047.66567: done getting the remaining hosts for this loop 9396 1727204047.66572: getting the next task for host managed-node1 9396 1727204047.66581: done getting next task for host managed-node1 9396 1727204047.66584: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 9396 1727204047.66588: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204047.66697: getting variables 9396 1727204047.66699: in VariableManager get_vars() 9396 1727204047.66740: Calling all_inventory to load vars for managed-node1 9396 1727204047.66743: Calling groups_inventory to load vars for managed-node1 9396 1727204047.66746: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.66758: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.66761: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.66766: Calling groups_plugins_play to load vars for managed-node1 9396 1727204047.67391: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003bc 9396 1727204047.67395: WORKER PROCESS EXITING 9396 1727204047.69007: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204047.71941: done with get_vars() 9396 1727204047.71982: done getting variables 9396 1727204047.72057: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204047.72199: variable 'profile' from source: include params 9396 1727204047.72203: variable 'item' from source: include params 9396 1727204047.72276: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.075) 0:00:23.694 ***** 9396 1727204047.72316: entering _queue_task() for managed-node1/assert 9396 1727204047.72699: worker is 1 (out of 1 available) 9396 1727204047.72715: exiting _queue_task() for managed-node1/assert 9396 1727204047.72728: done 
queuing things up, now waiting for results queue to drain 9396 1727204047.72730: waiting for pending results... 9396 1727204047.72969: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' 9396 1727204047.73106: in run() - task 12b410aa-8751-36c5-1f9e-000000000261 9396 1727204047.73132: variable 'ansible_search_path' from source: unknown 9396 1727204047.73144: variable 'ansible_search_path' from source: unknown 9396 1727204047.73284: calling self._execute() 9396 1727204047.73332: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.73347: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.73367: variable 'omit' from source: magic vars 9396 1727204047.73920: variable 'ansible_distribution_major_version' from source: facts 9396 1727204047.73952: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204047.73966: variable 'omit' from source: magic vars 9396 1727204047.74027: variable 'omit' from source: magic vars 9396 1727204047.74174: variable 'profile' from source: include params 9396 1727204047.74186: variable 'item' from source: include params 9396 1727204047.74287: variable 'item' from source: include params 9396 1727204047.74318: variable 'omit' from source: magic vars 9396 1727204047.74381: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204047.74431: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204047.74505: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204047.74529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204047.74999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 9396 1727204047.75002: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204047.75005: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.75008: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.75010: Set connection var ansible_timeout to 10 9396 1727204047.75013: Set connection var ansible_shell_executable to /bin/sh 9396 1727204047.75015: Set connection var ansible_pipelining to False 9396 1727204047.75017: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204047.75020: Set connection var ansible_connection to ssh 9396 1727204047.75022: Set connection var ansible_shell_type to sh 9396 1727204047.75024: variable 'ansible_shell_executable' from source: unknown 9396 1727204047.75026: variable 'ansible_connection' from source: unknown 9396 1727204047.75028: variable 'ansible_module_compression' from source: unknown 9396 1727204047.75031: variable 'ansible_shell_type' from source: unknown 9396 1727204047.75033: variable 'ansible_shell_executable' from source: unknown 9396 1727204047.75035: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.75037: variable 'ansible_pipelining' from source: unknown 9396 1727204047.75039: variable 'ansible_timeout' from source: unknown 9396 1727204047.75042: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.75080: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204047.75096: variable 'omit' from source: magic vars 9396 1727204047.75103: starting attempt loop 9396 1727204047.75106: running the handler 9396 1727204047.75249: variable 'lsr_net_profile_exists' 
from source: set_fact 9396 1727204047.75255: Evaluated conditional (lsr_net_profile_exists): True 9396 1727204047.75270: handler run complete 9396 1727204047.75291: attempt loop complete, returning result 9396 1727204047.75294: _execute() done 9396 1727204047.75297: dumping result to json 9396 1727204047.75302: done dumping result, returning 9396 1727204047.75313: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0' [12b410aa-8751-36c5-1f9e-000000000261] 9396 1727204047.75320: sending task result for task 12b410aa-8751-36c5-1f9e-000000000261 ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204047.75515: no more pending results, returning what we have 9396 1727204047.75520: results queue empty 9396 1727204047.75521: checking for any_errors_fatal 9396 1727204047.75529: done checking for any_errors_fatal 9396 1727204047.75530: checking for max_fail_percentage 9396 1727204047.75531: done checking for max_fail_percentage 9396 1727204047.75532: checking to see if all hosts have failed and the running result is not ok 9396 1727204047.75534: done checking to see if all hosts have failed 9396 1727204047.75535: getting the remaining hosts for this loop 9396 1727204047.75536: done getting the remaining hosts for this loop 9396 1727204047.75541: getting the next task for host managed-node1 9396 1727204047.75547: done getting next task for host managed-node1 9396 1727204047.75551: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 9396 1727204047.75554: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204047.75557: getting variables 9396 1727204047.75559: in VariableManager get_vars() 9396 1727204047.75604: Calling all_inventory to load vars for managed-node1 9396 1727204047.75607: Calling groups_inventory to load vars for managed-node1 9396 1727204047.75610: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.75622: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.75625: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.75628: Calling groups_plugins_play to load vars for managed-node1 9396 1727204047.76288: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000261 9396 1727204047.76293: WORKER PROCESS EXITING 9396 1727204047.82646: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204047.85680: done with get_vars() 9396 1727204047.85721: done getting variables 9396 1727204047.85783: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204047.85911: variable 'profile' from source: include params 9396 1727204047.85914: variable 'item' from source: include params 9396 1727204047.85992: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.137) 0:00:23.831 ***** 9396 1727204047.86032: entering _queue_task() for 
managed-node1/assert 9396 1727204047.86400: worker is 1 (out of 1 available) 9396 1727204047.86423: exiting _queue_task() for managed-node1/assert 9396 1727204047.86438: done queuing things up, now waiting for results queue to drain 9396 1727204047.86440: waiting for pending results... 9396 1727204047.86768: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' 9396 1727204047.86794: in run() - task 12b410aa-8751-36c5-1f9e-000000000262 9396 1727204047.86812: variable 'ansible_search_path' from source: unknown 9396 1727204047.86816: variable 'ansible_search_path' from source: unknown 9396 1727204047.86851: calling self._execute() 9396 1727204047.86951: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.86958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.86973: variable 'omit' from source: magic vars 9396 1727204047.87381: variable 'ansible_distribution_major_version' from source: facts 9396 1727204047.87394: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204047.87495: variable 'omit' from source: magic vars 9396 1727204047.87499: variable 'omit' from source: magic vars 9396 1727204047.87566: variable 'profile' from source: include params 9396 1727204047.87570: variable 'item' from source: include params 9396 1727204047.87648: variable 'item' from source: include params 9396 1727204047.87670: variable 'omit' from source: magic vars 9396 1727204047.87714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204047.87754: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204047.87776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204047.87799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 9396 1727204047.87814: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204047.87848: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204047.87852: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.87858: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.87979: Set connection var ansible_timeout to 10 9396 1727204047.88195: Set connection var ansible_shell_executable to /bin/sh 9396 1727204047.88199: Set connection var ansible_pipelining to False 9396 1727204047.88201: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204047.88204: Set connection var ansible_connection to ssh 9396 1727204047.88210: Set connection var ansible_shell_type to sh 9396 1727204047.88213: variable 'ansible_shell_executable' from source: unknown 9396 1727204047.88215: variable 'ansible_connection' from source: unknown 9396 1727204047.88218: variable 'ansible_module_compression' from source: unknown 9396 1727204047.88220: variable 'ansible_shell_type' from source: unknown 9396 1727204047.88223: variable 'ansible_shell_executable' from source: unknown 9396 1727204047.88226: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.88228: variable 'ansible_pipelining' from source: unknown 9396 1727204047.88234: variable 'ansible_timeout' from source: unknown 9396 1727204047.88237: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.88240: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204047.88243: 
variable 'omit' from source: magic vars 9396 1727204047.88251: starting attempt loop 9396 1727204047.88254: running the handler 9396 1727204047.88380: variable 'lsr_net_profile_ansible_managed' from source: set_fact 9396 1727204047.88386: Evaluated conditional (lsr_net_profile_ansible_managed): True 9396 1727204047.88398: handler run complete 9396 1727204047.88420: attempt loop complete, returning result 9396 1727204047.88423: _execute() done 9396 1727204047.88426: dumping result to json 9396 1727204047.88429: done dumping result, returning 9396 1727204047.88439: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0' [12b410aa-8751-36c5-1f9e-000000000262] 9396 1727204047.88445: sending task result for task 12b410aa-8751-36c5-1f9e-000000000262 9396 1727204047.88546: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000262 9396 1727204047.88548: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204047.88630: no more pending results, returning what we have 9396 1727204047.88634: results queue empty 9396 1727204047.88635: checking for any_errors_fatal 9396 1727204047.88641: done checking for any_errors_fatal 9396 1727204047.88641: checking for max_fail_percentage 9396 1727204047.88643: done checking for max_fail_percentage 9396 1727204047.88644: checking to see if all hosts have failed and the running result is not ok 9396 1727204047.88645: done checking to see if all hosts have failed 9396 1727204047.88646: getting the remaining hosts for this loop 9396 1727204047.88648: done getting the remaining hosts for this loop 9396 1727204047.88652: getting the next task for host managed-node1 9396 1727204047.88657: done getting next task for host managed-node1 9396 1727204047.88660: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 9396 1727204047.88663: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204047.88667: getting variables 9396 1727204047.88668: in VariableManager get_vars() 9396 1727204047.88714: Calling all_inventory to load vars for managed-node1 9396 1727204047.88717: Calling groups_inventory to load vars for managed-node1 9396 1727204047.88720: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.88730: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.88733: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.88736: Calling groups_plugins_play to load vars for managed-node1 9396 1727204047.90918: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204047.93985: done with get_vars() 9396 1727204047.94031: done getting variables 9396 1727204047.94106: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204047.94256: variable 'profile' from source: include params 9396 1727204047.94260: variable 'item' from source: include params 9396 1727204047.94340: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 September 2024 14:54:07 -0400 (0:00:00.083) 0:00:23.914 ***** 9396 1727204047.94387: entering _queue_task() for managed-node1/assert 9396 1727204047.94777: worker is 1 (out of 1 available) 9396 1727204047.94797: exiting _queue_task() for managed-node1/assert 9396 1727204047.94814: done queuing things up, now waiting for results queue to drain 9396 1727204047.94816: waiting for pending results... 9396 1727204047.95132: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 9396 1727204047.95268: in run() - task 12b410aa-8751-36c5-1f9e-000000000263 9396 1727204047.95329: variable 'ansible_search_path' from source: unknown 9396 1727204047.95333: variable 'ansible_search_path' from source: unknown 9396 1727204047.95354: calling self._execute() 9396 1727204047.95478: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.95495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.95547: variable 'omit' from source: magic vars 9396 1727204047.95995: variable 'ansible_distribution_major_version' from source: facts 9396 1727204047.96018: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204047.96032: variable 'omit' from source: magic vars 9396 1727204047.96094: variable 'omit' from source: magic vars 9396 1727204047.96312: variable 'profile' from source: include params 9396 1727204047.96316: variable 'item' from source: include params 9396 1727204047.96325: variable 'item' from source: include params 9396 1727204047.96353: variable 'omit' from source: magic vars 9396 1727204047.96404: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204047.96459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 9396 1727204047.96487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204047.96528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204047.96549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204047.96592: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204047.96603: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.96616: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.96761: Set connection var ansible_timeout to 10 9396 1727204047.96776: Set connection var ansible_shell_executable to /bin/sh 9396 1727204047.96855: Set connection var ansible_pipelining to False 9396 1727204047.96859: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204047.96861: Set connection var ansible_connection to ssh 9396 1727204047.96864: Set connection var ansible_shell_type to sh 9396 1727204047.96866: variable 'ansible_shell_executable' from source: unknown 9396 1727204047.96868: variable 'ansible_connection' from source: unknown 9396 1727204047.96878: variable 'ansible_module_compression' from source: unknown 9396 1727204047.96886: variable 'ansible_shell_type' from source: unknown 9396 1727204047.96896: variable 'ansible_shell_executable' from source: unknown 9396 1727204047.96905: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204047.96917: variable 'ansible_pipelining' from source: unknown 9396 1727204047.96926: variable 'ansible_timeout' from source: unknown 9396 1727204047.96936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204047.97124: Loading ActionModule 'assert' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204047.97183: variable 'omit' from source: magic vars 9396 1727204047.97187: starting attempt loop 9396 1727204047.97192: running the handler 9396 1727204047.97317: variable 'lsr_net_profile_fingerprint' from source: set_fact 9396 1727204047.97330: Evaluated conditional (lsr_net_profile_fingerprint): True 9396 1727204047.97343: handler run complete 9396 1727204047.97367: attempt loop complete, returning result 9396 1727204047.97376: _execute() done 9396 1727204047.97399: dumping result to json 9396 1727204047.97402: done dumping result, returning 9396 1727204047.97512: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0 [12b410aa-8751-36c5-1f9e-000000000263] 9396 1727204047.97515: sending task result for task 12b410aa-8751-36c5-1f9e-000000000263 9396 1727204047.97583: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000263 9396 1727204047.97586: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204047.97668: no more pending results, returning what we have 9396 1727204047.97673: results queue empty 9396 1727204047.97674: checking for any_errors_fatal 9396 1727204047.97682: done checking for any_errors_fatal 9396 1727204047.97683: checking for max_fail_percentage 9396 1727204047.97685: done checking for max_fail_percentage 9396 1727204047.97686: checking to see if all hosts have failed and the running result is not ok 9396 1727204047.97688: done checking to see if all hosts have failed 9396 1727204047.97688: getting the remaining hosts for this loop 9396 1727204047.97692: done getting the remaining hosts for this loop 9396 1727204047.97697: getting the next task for host 
managed-node1 9396 1727204047.97710: done getting next task for host managed-node1 9396 1727204047.97714: ^ task is: TASK: Include the task 'get_profile_stat.yml' 9396 1727204047.97718: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204047.97723: getting variables 9396 1727204047.97725: in VariableManager get_vars() 9396 1727204047.97772: Calling all_inventory to load vars for managed-node1 9396 1727204047.97776: Calling groups_inventory to load vars for managed-node1 9396 1727204047.97780: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204047.97897: Calling all_plugins_play to load vars for managed-node1 9396 1727204047.97910: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204047.97916: Calling groups_plugins_play to load vars for managed-node1 9396 1727204048.00474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204048.03603: done with get_vars() 9396 1727204048.03644: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.093) 0:00:24.008 ***** 9396 1727204048.03779: entering _queue_task() for managed-node1/include_tasks 9396 1727204048.04174: worker is 1 (out of 1 available) 9396 
1727204048.04194: exiting _queue_task() for managed-node1/include_tasks 9396 1727204048.04214: done queuing things up, now waiting for results queue to drain 9396 1727204048.04216: waiting for pending results... 9396 1727204048.04570: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 9396 1727204048.04796: in run() - task 12b410aa-8751-36c5-1f9e-000000000267 9396 1727204048.04800: variable 'ansible_search_path' from source: unknown 9396 1727204048.04803: variable 'ansible_search_path' from source: unknown 9396 1727204048.04817: calling self._execute() 9396 1727204048.04946: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.04967: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.04986: variable 'omit' from source: magic vars 9396 1727204048.05482: variable 'ansible_distribution_major_version' from source: facts 9396 1727204048.05514: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204048.05594: _execute() done 9396 1727204048.05598: dumping result to json 9396 1727204048.05603: done dumping result, returning 9396 1727204048.05606: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-36c5-1f9e-000000000267] 9396 1727204048.05611: sending task result for task 12b410aa-8751-36c5-1f9e-000000000267 9396 1727204048.05693: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000267 9396 1727204048.05696: WORKER PROCESS EXITING 9396 1727204048.05732: no more pending results, returning what we have 9396 1727204048.05738: in VariableManager get_vars() 9396 1727204048.05794: Calling all_inventory to load vars for managed-node1 9396 1727204048.05798: Calling groups_inventory to load vars for managed-node1 9396 1727204048.05801: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204048.05821: Calling all_plugins_play to load vars for managed-node1 9396 
1727204048.05826: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204048.05831: Calling groups_plugins_play to load vars for managed-node1 9396 1727204048.08506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204048.11580: done with get_vars() 9396 1727204048.11618: variable 'ansible_search_path' from source: unknown 9396 1727204048.11620: variable 'ansible_search_path' from source: unknown 9396 1727204048.11672: we have included files to process 9396 1727204048.11674: generating all_blocks data 9396 1727204048.11676: done generating all_blocks data 9396 1727204048.11682: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204048.11684: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204048.11687: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204048.12986: done processing included file 9396 1727204048.12991: iterating over new_blocks loaded from include file 9396 1727204048.12994: in VariableManager get_vars() 9396 1727204048.13025: done with get_vars() 9396 1727204048.13028: filtering new block on tags 9396 1727204048.13068: done filtering new block on tags 9396 1727204048.13072: in VariableManager get_vars() 9396 1727204048.13101: done with get_vars() 9396 1727204048.13104: filtering new block on tags 9396 1727204048.13136: done filtering new block on tags 9396 1727204048.13140: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 9396 1727204048.13147: extending task lists for all hosts with included blocks 9396 
1727204048.13425: done extending task lists 9396 1727204048.13427: done processing included files 9396 1727204048.13428: results queue empty 9396 1727204048.13429: checking for any_errors_fatal 9396 1727204048.13432: done checking for any_errors_fatal 9396 1727204048.13433: checking for max_fail_percentage 9396 1727204048.13435: done checking for max_fail_percentage 9396 1727204048.13436: checking to see if all hosts have failed and the running result is not ok 9396 1727204048.13437: done checking to see if all hosts have failed 9396 1727204048.13438: getting the remaining hosts for this loop 9396 1727204048.13439: done getting the remaining hosts for this loop 9396 1727204048.13443: getting the next task for host managed-node1 9396 1727204048.13447: done getting next task for host managed-node1 9396 1727204048.13450: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 9396 1727204048.13454: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204048.13457: getting variables 9396 1727204048.13458: in VariableManager get_vars() 9396 1727204048.13475: Calling all_inventory to load vars for managed-node1 9396 1727204048.13478: Calling groups_inventory to load vars for managed-node1 9396 1727204048.13486: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204048.13495: Calling all_plugins_play to load vars for managed-node1 9396 1727204048.13499: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204048.13503: Calling groups_plugins_play to load vars for managed-node1 9396 1727204048.17536: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204048.22203: done with get_vars() 9396 1727204048.22251: done getting variables 9396 1727204048.22410: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.186) 0:00:24.195 ***** 9396 1727204048.22450: entering _queue_task() for managed-node1/set_fact 9396 1727204048.23482: worker is 1 (out of 1 available) 9396 1727204048.23502: exiting _queue_task() for managed-node1/set_fact 9396 1727204048.23518: done queuing things up, now waiting for results queue to drain 9396 1727204048.23520: waiting for pending results... 
9396 1727204048.24117: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 9396 1727204048.24124: in run() - task 12b410aa-8751-36c5-1f9e-0000000003fb 9396 1727204048.24127: variable 'ansible_search_path' from source: unknown 9396 1727204048.24133: variable 'ansible_search_path' from source: unknown 9396 1727204048.24179: calling self._execute() 9396 1727204048.24300: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.24326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.24344: variable 'omit' from source: magic vars 9396 1727204048.24863: variable 'ansible_distribution_major_version' from source: facts 9396 1727204048.24869: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204048.24872: variable 'omit' from source: magic vars 9396 1727204048.24929: variable 'omit' from source: magic vars 9396 1727204048.24995: variable 'omit' from source: magic vars 9396 1727204048.25051: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204048.25113: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204048.25191: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204048.25197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204048.25200: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204048.25239: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204048.25248: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.25256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 
1727204048.25391: Set connection var ansible_timeout to 10 9396 1727204048.25415: Set connection var ansible_shell_executable to /bin/sh 9396 1727204048.25519: Set connection var ansible_pipelining to False 9396 1727204048.25523: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204048.25525: Set connection var ansible_connection to ssh 9396 1727204048.25528: Set connection var ansible_shell_type to sh 9396 1727204048.25531: variable 'ansible_shell_executable' from source: unknown 9396 1727204048.25533: variable 'ansible_connection' from source: unknown 9396 1727204048.25536: variable 'ansible_module_compression' from source: unknown 9396 1727204048.25540: variable 'ansible_shell_type' from source: unknown 9396 1727204048.25542: variable 'ansible_shell_executable' from source: unknown 9396 1727204048.25545: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.25547: variable 'ansible_pipelining' from source: unknown 9396 1727204048.25549: variable 'ansible_timeout' from source: unknown 9396 1727204048.25558: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.25759: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204048.25780: variable 'omit' from source: magic vars 9396 1727204048.25786: starting attempt loop 9396 1727204048.25792: running the handler 9396 1727204048.25806: handler run complete 9396 1727204048.25820: attempt loop complete, returning result 9396 1727204048.25823: _execute() done 9396 1727204048.25826: dumping result to json 9396 1727204048.25829: done dumping result, returning 9396 1727204048.25837: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and 
ansible_managed comment flag [12b410aa-8751-36c5-1f9e-0000000003fb] 9396 1727204048.25895: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003fb 9396 1727204048.26179: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003fb 9396 1727204048.26183: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 9396 1727204048.26264: no more pending results, returning what we have 9396 1727204048.26267: results queue empty 9396 1727204048.26268: checking for any_errors_fatal 9396 1727204048.26270: done checking for any_errors_fatal 9396 1727204048.26271: checking for max_fail_percentage 9396 1727204048.26279: done checking for max_fail_percentage 9396 1727204048.26280: checking to see if all hosts have failed and the running result is not ok 9396 1727204048.26281: done checking to see if all hosts have failed 9396 1727204048.26282: getting the remaining hosts for this loop 9396 1727204048.26283: done getting the remaining hosts for this loop 9396 1727204048.26287: getting the next task for host managed-node1 9396 1727204048.26295: done getting next task for host managed-node1 9396 1727204048.26298: ^ task is: TASK: Stat profile file 9396 1727204048.26302: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204048.26305: getting variables 9396 1727204048.26309: in VariableManager get_vars() 9396 1727204048.26351: Calling all_inventory to load vars for managed-node1 9396 1727204048.26354: Calling groups_inventory to load vars for managed-node1 9396 1727204048.26357: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204048.26367: Calling all_plugins_play to load vars for managed-node1 9396 1727204048.26370: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204048.26373: Calling groups_plugins_play to load vars for managed-node1 9396 1727204048.29068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204048.32437: done with get_vars() 9396 1727204048.32481: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.101) 0:00:24.297 ***** 9396 1727204048.32620: entering _queue_task() for managed-node1/stat 9396 1727204048.33033: worker is 1 (out of 1 available) 9396 1727204048.33047: exiting _queue_task() for managed-node1/stat 9396 1727204048.33061: done queuing things up, now waiting for results queue to drain 9396 1727204048.33063: waiting for pending results... 
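(Editorial aside: the `set_fact` result logged above — `lsr_net_profile_exists`, `lsr_net_profile_ansible_managed`, and `lsr_net_profile_fingerprint` all initialized to `false` at `get_profile_stat.yml:3` — and the earlier assert at `assert_profile_present.yml:15`, whose conditional `lsr_net_profile_fingerprint` evaluated True, correspond to tasks of roughly the following shape. This is a sketch reconstructed from the logged fact names and conditionals, not the contents of the actual test files, which may differ.)

```yaml
# Hypothetical reconstruction from the log above; the real
# get_profile_stat.yml / assert_profile_present.yml may differ.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false

- name: Assert that the fingerprint comment is present in bond0
  assert:
    that:
      - lsr_net_profile_fingerprint
```

Each flag is later flipped by the stat/grep tasks that follow in the log, and the assert tasks fail the run if a flag is still false when checked.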
9396 1727204048.33484: running TaskExecutor() for managed-node1/TASK: Stat profile file 9396 1727204048.33500: in run() - task 12b410aa-8751-36c5-1f9e-0000000003fc 9396 1727204048.33525: variable 'ansible_search_path' from source: unknown 9396 1727204048.33529: variable 'ansible_search_path' from source: unknown 9396 1727204048.33572: calling self._execute() 9396 1727204048.33686: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.33696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.33827: variable 'omit' from source: magic vars 9396 1727204048.34201: variable 'ansible_distribution_major_version' from source: facts 9396 1727204048.34257: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204048.34261: variable 'omit' from source: magic vars 9396 1727204048.34338: variable 'omit' from source: magic vars 9396 1727204048.34535: variable 'profile' from source: include params 9396 1727204048.34539: variable 'item' from source: include params 9396 1727204048.34541: variable 'item' from source: include params 9396 1727204048.34562: variable 'omit' from source: magic vars 9396 1727204048.34619: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204048.34898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204048.34902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204048.34904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204048.34910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204048.34912: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204048.34914: variable 
'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.34916: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.35119: Set connection var ansible_timeout to 10 9396 1727204048.35128: Set connection var ansible_shell_executable to /bin/sh 9396 1727204048.35140: Set connection var ansible_pipelining to False 9396 1727204048.35148: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204048.35155: Set connection var ansible_connection to ssh 9396 1727204048.35159: Set connection var ansible_shell_type to sh 9396 1727204048.35192: variable 'ansible_shell_executable' from source: unknown 9396 1727204048.35196: variable 'ansible_connection' from source: unknown 9396 1727204048.35199: variable 'ansible_module_compression' from source: unknown 9396 1727204048.35202: variable 'ansible_shell_type' from source: unknown 9396 1727204048.35304: variable 'ansible_shell_executable' from source: unknown 9396 1727204048.35310: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.35404: variable 'ansible_pipelining' from source: unknown 9396 1727204048.35410: variable 'ansible_timeout' from source: unknown 9396 1727204048.35414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.35814: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204048.35826: variable 'omit' from source: magic vars 9396 1727204048.35835: starting attempt loop 9396 1727204048.35837: running the handler 9396 1727204048.35854: _low_level_execute_command(): starting 9396 1727204048.35983: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204048.36891: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204048.36967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.36998: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204048.37013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204048.37039: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.37127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.38956: stdout chunk (state=3): >>>/root <<< 9396 1727204048.39377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204048.39380: stdout chunk (state=3): >>><<< 9396 1727204048.39383: stderr chunk (state=3): >>><<< 9396 1727204048.39388: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204048.39393: _low_level_execute_command(): starting 9396 1727204048.39396: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786 `" && echo ansible-tmp-1727204048.3925061-11338-87326214159786="` echo /root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786 `" ) && sleep 0' 9396 1727204048.40449: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204048.40456: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204048.40470: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204048.40477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.40512: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204048.40516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204048.40532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.40620: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204048.40638: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204048.40645: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.40725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.42937: stdout chunk (state=3): >>>ansible-tmp-1727204048.3925061-11338-87326214159786=/root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786 <<< 9396 1727204048.42945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204048.43102: stderr chunk (state=3): >>><<< 9396 1727204048.43106: stdout chunk (state=3): >>><<< 9396 1727204048.43155: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204048.3925061-11338-87326214159786=/root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204048.43186: variable 'ansible_module_compression' from source: unknown 9396 1727204048.43461: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9396 1727204048.43540: variable 'ansible_facts' from source: unknown 9396 1727204048.43615: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/AnsiballZ_stat.py 9396 1727204048.43920: Sending initial data 9396 1727204048.43923: Sent initial data (151 bytes) 9396 1727204048.44713: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.44725: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204048.44735: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204048.44744: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 
1727204048.44754: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204048.44796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.44800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204048.44803: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204048.44806: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.44968: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204048.45010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.45273: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.47027: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204048.47092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204048.47144: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmptjmg_dhm /root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/AnsiballZ_stat.py <<< 9396 1727204048.47160: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/AnsiballZ_stat.py" <<< 9396 1727204048.47192: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmptjmg_dhm" to remote "/root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/AnsiballZ_stat.py" <<< 9396 1727204048.48799: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204048.48917: stderr chunk (state=3): >>><<< 9396 1727204048.48920: stdout chunk (state=3): >>><<< 9396 1727204048.48940: done transferring module to remote 9396 1727204048.48946: _low_level_execute_command(): starting 9396 1727204048.48973: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/ /root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/AnsiballZ_stat.py && sleep 0' 9396 1727204048.49763: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204048.49843: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204048.49854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.49858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204048.49861: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204048.49864: stderr chunk 
(state=3): >>>debug2: match not found <<< 9396 1727204048.49867: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.49874: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204048.49877: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204048.49879: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204048.49882: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204048.49884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.49886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204048.49891: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204048.49902: stderr chunk (state=3): >>>debug2: match found <<< 9396 1727204048.49912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.49986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204048.50010: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204048.50013: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.50097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.52142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204048.52157: stderr chunk (state=3): >>><<< 9396 1727204048.52166: stdout chunk (state=3): >>><<< 9396 1727204048.52192: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204048.52202: _low_level_execute_command(): starting 9396 1727204048.52245: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/AnsiballZ_stat.py && sleep 0' 9396 1727204048.53011: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.53079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204048.53100: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204048.53127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.53247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.71356: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 9396 1727204048.72859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204048.72917: stderr chunk (state=3): >>><<< 9396 1727204048.72921: stdout chunk (state=3): >>><<< 9396 1727204048.72936: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
9396 1727204048.72963: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204048.72973: _low_level_execute_command(): starting 9396 1727204048.72979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204048.3925061-11338-87326214159786/ > /dev/null 2>&1 && sleep 0' 9396 1727204048.73439: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.73443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.73450: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204048.73452: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.73500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204048.73505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.73552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.75537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204048.75578: stderr chunk (state=3): >>><<< 9396 1727204048.75581: stdout chunk (state=3): >>><<< 9396 1727204048.75602: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
9396 1727204048.75614: handler run complete 9396 1727204048.75632: attempt loop complete, returning result 9396 1727204048.75635: _execute() done 9396 1727204048.75638: dumping result to json 9396 1727204048.75643: done dumping result, returning 9396 1727204048.75651: done running TaskExecutor() for managed-node1/TASK: Stat profile file [12b410aa-8751-36c5-1f9e-0000000003fc] 9396 1727204048.75656: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003fc 9396 1727204048.75761: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003fc 9396 1727204048.75764: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 9396 1727204048.75838: no more pending results, returning what we have 9396 1727204048.75842: results queue empty 9396 1727204048.75843: checking for any_errors_fatal 9396 1727204048.75851: done checking for any_errors_fatal 9396 1727204048.75852: checking for max_fail_percentage 9396 1727204048.75854: done checking for max_fail_percentage 9396 1727204048.75855: checking to see if all hosts have failed and the running result is not ok 9396 1727204048.75856: done checking to see if all hosts have failed 9396 1727204048.75857: getting the remaining hosts for this loop 9396 1727204048.75859: done getting the remaining hosts for this loop 9396 1727204048.75863: getting the next task for host managed-node1 9396 1727204048.75869: done getting next task for host managed-node1 9396 1727204048.75874: ^ task is: TASK: Set NM profile exist flag based on the profile files 9396 1727204048.75879: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204048.75882: getting variables 9396 1727204048.75884: in VariableManager get_vars() 9396 1727204048.75932: Calling all_inventory to load vars for managed-node1 9396 1727204048.75936: Calling groups_inventory to load vars for managed-node1 9396 1727204048.75939: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204048.75951: Calling all_plugins_play to load vars for managed-node1 9396 1727204048.75954: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204048.75957: Calling groups_plugins_play to load vars for managed-node1 9396 1727204048.77288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204048.78861: done with get_vars() 9396 1727204048.78883: done getting variables 9396 1727204048.78942: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.463) 0:00:24.760 ***** 9396 1727204048.78968: entering _queue_task() for managed-node1/set_fact 9396 
1727204048.79219: worker is 1 (out of 1 available) 9396 1727204048.79234: exiting _queue_task() for managed-node1/set_fact 9396 1727204048.79246: done queuing things up, now waiting for results queue to drain 9396 1727204048.79248: waiting for pending results... 9396 1727204048.79433: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 9396 1727204048.79524: in run() - task 12b410aa-8751-36c5-1f9e-0000000003fd 9396 1727204048.79537: variable 'ansible_search_path' from source: unknown 9396 1727204048.79542: variable 'ansible_search_path' from source: unknown 9396 1727204048.79581: calling self._execute() 9396 1727204048.79662: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.79669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.79680: variable 'omit' from source: magic vars 9396 1727204048.79991: variable 'ansible_distribution_major_version' from source: facts 9396 1727204048.80002: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204048.80113: variable 'profile_stat' from source: set_fact 9396 1727204048.80124: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204048.80129: when evaluation is False, skipping this task 9396 1727204048.80132: _execute() done 9396 1727204048.80135: dumping result to json 9396 1727204048.80139: done dumping result, returning 9396 1727204048.80150: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-36c5-1f9e-0000000003fd] 9396 1727204048.80153: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003fd 9396 1727204048.80246: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003fd 9396 1727204048.80258: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 
1727204048.80316: no more pending results, returning what we have 9396 1727204048.80320: results queue empty 9396 1727204048.80321: checking for any_errors_fatal 9396 1727204048.80330: done checking for any_errors_fatal 9396 1727204048.80331: checking for max_fail_percentage 9396 1727204048.80333: done checking for max_fail_percentage 9396 1727204048.80334: checking to see if all hosts have failed and the running result is not ok 9396 1727204048.80335: done checking to see if all hosts have failed 9396 1727204048.80336: getting the remaining hosts for this loop 9396 1727204048.80337: done getting the remaining hosts for this loop 9396 1727204048.80341: getting the next task for host managed-node1 9396 1727204048.80348: done getting next task for host managed-node1 9396 1727204048.80351: ^ task is: TASK: Get NM profile info 9396 1727204048.80355: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204048.80366: getting variables 9396 1727204048.80368: in VariableManager get_vars() 9396 1727204048.80405: Calling all_inventory to load vars for managed-node1 9396 1727204048.80410: Calling groups_inventory to load vars for managed-node1 9396 1727204048.80413: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204048.80424: Calling all_plugins_play to load vars for managed-node1 9396 1727204048.80427: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204048.80431: Calling groups_plugins_play to load vars for managed-node1 9396 1727204048.81601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204048.83163: done with get_vars() 9396 1727204048.83186: done getting variables 9396 1727204048.83238: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:08 -0400 (0:00:00.042) 0:00:24.803 ***** 9396 1727204048.83262: entering _queue_task() for managed-node1/shell 9396 1727204048.83499: worker is 1 (out of 1 available) 9396 1727204048.83515: exiting _queue_task() for managed-node1/shell 9396 1727204048.83528: done queuing things up, now waiting for results queue to drain 9396 1727204048.83529: waiting for pending results... 
9396 1727204048.83698: running TaskExecutor() for managed-node1/TASK: Get NM profile info 9396 1727204048.83785: in run() - task 12b410aa-8751-36c5-1f9e-0000000003fe 9396 1727204048.83800: variable 'ansible_search_path' from source: unknown 9396 1727204048.83805: variable 'ansible_search_path' from source: unknown 9396 1727204048.83834: calling self._execute() 9396 1727204048.83916: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.83923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.83934: variable 'omit' from source: magic vars 9396 1727204048.84234: variable 'ansible_distribution_major_version' from source: facts 9396 1727204048.84245: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204048.84252: variable 'omit' from source: magic vars 9396 1727204048.84293: variable 'omit' from source: magic vars 9396 1727204048.84378: variable 'profile' from source: include params 9396 1727204048.84382: variable 'item' from source: include params 9396 1727204048.84442: variable 'item' from source: include params 9396 1727204048.84458: variable 'omit' from source: magic vars 9396 1727204048.84494: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204048.84528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204048.84547: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204048.84564: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204048.84575: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204048.84603: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204048.84606: variable 
'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.84611: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.84694: Set connection var ansible_timeout to 10 9396 1727204048.84702: Set connection var ansible_shell_executable to /bin/sh 9396 1727204048.84712: Set connection var ansible_pipelining to False 9396 1727204048.84718: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204048.84724: Set connection var ansible_connection to ssh 9396 1727204048.84729: Set connection var ansible_shell_type to sh 9396 1727204048.84755: variable 'ansible_shell_executable' from source: unknown 9396 1727204048.84758: variable 'ansible_connection' from source: unknown 9396 1727204048.84761: variable 'ansible_module_compression' from source: unknown 9396 1727204048.84764: variable 'ansible_shell_type' from source: unknown 9396 1727204048.84766: variable 'ansible_shell_executable' from source: unknown 9396 1727204048.84769: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204048.84775: variable 'ansible_pipelining' from source: unknown 9396 1727204048.84778: variable 'ansible_timeout' from source: unknown 9396 1727204048.84783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204048.84902: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204048.84913: variable 'omit' from source: magic vars 9396 1727204048.84920: starting attempt loop 9396 1727204048.84923: running the handler 9396 1727204048.84933: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204048.84952: _low_level_execute_command(): starting 9396 1727204048.84958: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204048.85492: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.85496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.85499: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.85501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.85561: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204048.85565: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204048.85569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.85620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.87404: stdout chunk (state=3): >>>/root <<< 9396 1727204048.87511: stderr chunk (state=3): >>>debug2: Received exit 
status from master 0 <<< 9396 1727204048.87567: stderr chunk (state=3): >>><<< 9396 1727204048.87570: stdout chunk (state=3): >>><<< 9396 1727204048.87597: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204048.87611: _low_level_execute_command(): starting 9396 1727204048.87615: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704 `" && echo ansible-tmp-1727204048.8759594-11356-174826160587704="` echo /root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704 `" ) && sleep 0' 9396 1727204048.88059: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204048.88071: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204048.88102: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.88106: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.88111: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.88167: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204048.88171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.88225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.90340: stdout chunk (state=3): >>>ansible-tmp-1727204048.8759594-11356-174826160587704=/root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704 <<< 9396 1727204048.90443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204048.90500: stderr chunk (state=3): >>><<< 9396 1727204048.90503: stdout chunk (state=3): >>><<< 9396 1727204048.90521: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204048.8759594-11356-174826160587704=/root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204048.90552: variable 'ansible_module_compression' from source: unknown 9396 1727204048.90599: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204048.90638: variable 'ansible_facts' from source: unknown 9396 1727204048.90700: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/AnsiballZ_command.py 9396 1727204048.90824: Sending initial data 9396 1727204048.90828: Sent initial data (155 bytes) 9396 1727204048.91299: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204048.91303: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.91306: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.91311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.91366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204048.91369: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.91432: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.93148: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 9396 1727204048.93153: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204048.93188: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204048.93253: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp7tzlwuyp /root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/AnsiballZ_command.py <<< 9396 1727204048.93258: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/AnsiballZ_command.py" <<< 9396 1727204048.93316: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp7tzlwuyp" to remote "/root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/AnsiballZ_command.py" <<< 9396 1727204048.94124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204048.94180: stderr chunk (state=3): >>><<< 9396 1727204048.94184: stdout chunk (state=3): >>><<< 9396 1727204048.94205: done transferring module to remote 9396 1727204048.94219: _low_level_execute_command(): starting 9396 1727204048.94223: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/ /root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/AnsiballZ_command.py && sleep 0' 9396 1727204048.94669: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204048.94680: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.94706: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204048.94713: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.94766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204048.94769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.94819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204048.96768: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204048.96820: stderr chunk (state=3): >>><<< 9396 1727204048.96823: stdout chunk (state=3): >>><<< 9396 1727204048.96840: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204048.96844: _low_level_execute_command(): starting 9396 1727204048.96846: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/AnsiballZ_command.py && sleep 0' 9396 1727204048.97284: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.97324: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204048.97328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.97330: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204048.97332: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204048.97335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204048.97387: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 
1727204048.97394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204048.97442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204049.17933: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:09.154735", "end": "2024-09-24 14:54:09.178352", "delta": "0:00:00.023617", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204049.19998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204049.20002: stdout chunk (state=3): >>><<< 9396 1727204049.20005: stderr chunk (state=3): >>><<< 9396 1727204049.20010: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-24 14:54:09.154735", "end": "2024-09-24 14:54:09.178352", "delta": "0:00:00.023617", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 
closed. 9396 1727204049.20013: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204049.20016: _low_level_execute_command(): starting 9396 1727204049.20018: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204048.8759594-11356-174826160587704/ > /dev/null 2>&1 && sleep 0' 9396 1727204049.20636: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204049.20650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204049.20665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204049.20800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204049.20827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204049.20912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204049.22896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204049.22995: stderr chunk (state=3): >>><<< 9396 1727204049.22998: stdout chunk (state=3): >>><<< 9396 1727204049.23020: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204049.23029: 
handler run complete 9396 1727204049.23144: Evaluated conditional (False): False 9396 1727204049.23148: attempt loop complete, returning result 9396 1727204049.23150: _execute() done 9396 1727204049.23152: dumping result to json 9396 1727204049.23154: done dumping result, returning 9396 1727204049.23157: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [12b410aa-8751-36c5-1f9e-0000000003fe] 9396 1727204049.23159: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003fe 9396 1727204049.23440: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003fe 9396 1727204049.23444: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.023617", "end": "2024-09-24 14:54:09.178352", "rc": 0, "start": "2024-09-24 14:54:09.154735" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 9396 1727204049.23627: no more pending results, returning what we have 9396 1727204049.23632: results queue empty 9396 1727204049.23633: checking for any_errors_fatal 9396 1727204049.23643: done checking for any_errors_fatal 9396 1727204049.23644: checking for max_fail_percentage 9396 1727204049.23646: done checking for max_fail_percentage 9396 1727204049.23647: checking to see if all hosts have failed and the running result is not ok 9396 1727204049.23649: done checking to see if all hosts have failed 9396 1727204049.23650: getting the remaining hosts for this loop 9396 1727204049.23651: done getting the remaining hosts for this loop 9396 1727204049.23656: getting the next task for host managed-node1 9396 1727204049.23665: done getting next task for host managed-node1 9396 1727204049.23668: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 9396 1727204049.23673: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204049.23678: getting variables 9396 1727204049.23680: in VariableManager get_vars() 9396 1727204049.24134: Calling all_inventory to load vars for managed-node1 9396 1727204049.24138: Calling groups_inventory to load vars for managed-node1 9396 1727204049.24141: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.24154: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.24158: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.24163: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.27164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.30353: done with get_vars() 9396 1727204049.30392: done getting variables 9396 1727204049.30464: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.472) 0:00:25.275 ***** 9396 1727204049.30505: entering _queue_task() for managed-node1/set_fact 9396 1727204049.30855: worker is 1 (out of 1 available) 9396 1727204049.30870: exiting _queue_task() for managed-node1/set_fact 9396 1727204049.30882: done queuing things up, now waiting for results queue to drain 9396 1727204049.30883: waiting for pending results... 9396 1727204049.31312: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 9396 1727204049.31340: in run() - task 12b410aa-8751-36c5-1f9e-0000000003ff 9396 1727204049.31361: variable 'ansible_search_path' from source: unknown 9396 1727204049.31369: variable 'ansible_search_path' from source: unknown 9396 1727204049.31417: calling self._execute() 9396 1727204049.31528: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.31548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.31565: variable 'omit' from source: magic vars 9396 1727204049.32005: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.32080: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.32213: variable 'nm_profile_exists' from source: set_fact 9396 1727204049.32236: Evaluated conditional (nm_profile_exists.rc == 0): True 9396 1727204049.32247: variable 'omit' from source: magic vars 9396 1727204049.32320: variable 'omit' from source: magic vars 9396 1727204049.32365: variable 'omit' from source: magic vars 9396 1727204049.32423: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204049.32470: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 9396 1727204049.32515: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204049.32534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204049.32553: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204049.32624: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204049.32628: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.32631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.32748: Set connection var ansible_timeout to 10 9396 1727204049.32763: Set connection var ansible_shell_executable to /bin/sh 9396 1727204049.32779: Set connection var ansible_pipelining to False 9396 1727204049.32842: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204049.32845: Set connection var ansible_connection to ssh 9396 1727204049.32847: Set connection var ansible_shell_type to sh 9396 1727204049.32850: variable 'ansible_shell_executable' from source: unknown 9396 1727204049.32854: variable 'ansible_connection' from source: unknown 9396 1727204049.32863: variable 'ansible_module_compression' from source: unknown 9396 1727204049.32870: variable 'ansible_shell_type' from source: unknown 9396 1727204049.32877: variable 'ansible_shell_executable' from source: unknown 9396 1727204049.32886: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.32899: variable 'ansible_pipelining' from source: unknown 9396 1727204049.32911: variable 'ansible_timeout' from source: unknown 9396 1727204049.32922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.33103: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204049.33168: variable 'omit' from source: magic vars 9396 1727204049.33171: starting attempt loop 9396 1727204049.33173: running the handler 9396 1727204049.33174: handler run complete 9396 1727204049.33177: attempt loop complete, returning result 9396 1727204049.33178: _execute() done 9396 1727204049.33180: dumping result to json 9396 1727204049.33185: done dumping result, returning 9396 1727204049.33200: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-36c5-1f9e-0000000003ff] 9396 1727204049.33212: sending task result for task 12b410aa-8751-36c5-1f9e-0000000003ff 9396 1727204049.33441: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000003ff 9396 1727204049.33445: WORKER PROCESS EXITING ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 9396 1727204049.33521: no more pending results, returning what we have 9396 1727204049.33526: results queue empty 9396 1727204049.33528: checking for any_errors_fatal 9396 1727204049.33539: done checking for any_errors_fatal 9396 1727204049.33540: checking for max_fail_percentage 9396 1727204049.33542: done checking for max_fail_percentage 9396 1727204049.33544: checking to see if all hosts have failed and the running result is not ok 9396 1727204049.33545: done checking to see if all hosts have failed 9396 1727204049.33546: getting the remaining hosts for this loop 9396 1727204049.33548: done getting the remaining hosts for this loop 9396 1727204049.33553: getting the next task for host managed-node1 9396 
1727204049.33565: done getting next task for host managed-node1 9396 1727204049.33570: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 9396 1727204049.33575: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204049.33582: getting variables 9396 1727204049.33584: in VariableManager get_vars() 9396 1727204049.33760: Calling all_inventory to load vars for managed-node1 9396 1727204049.33764: Calling groups_inventory to load vars for managed-node1 9396 1727204049.33767: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.33781: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.33785: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.33897: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.36387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.39372: done with get_vars() 9396 1727204049.39417: done getting variables 9396 1727204049.39488: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204049.39633: variable 'profile' from source: include params 9396 1727204049.39637: variable 'item' from source: include params 9396 1727204049.39716: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.092) 0:00:25.368 ***** 9396 1727204049.39759: entering _queue_task() for managed-node1/command 9396 1727204049.40141: worker is 1 (out of 1 available) 9396 1727204049.40158: exiting _queue_task() for managed-node1/command 9396 1727204049.40172: done queuing things up, now waiting for results queue to drain 9396 1727204049.40174: waiting for pending results... 
9396 1727204049.40485: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 9396 1727204049.40698: in run() - task 12b410aa-8751-36c5-1f9e-000000000401 9396 1727204049.40702: variable 'ansible_search_path' from source: unknown 9396 1727204049.40705: variable 'ansible_search_path' from source: unknown 9396 1727204049.40724: calling self._execute() 9396 1727204049.40840: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.40853: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.40895: variable 'omit' from source: magic vars 9396 1727204049.41319: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.41345: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.41540: variable 'profile_stat' from source: set_fact 9396 1727204049.41594: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204049.41598: when evaluation is False, skipping this task 9396 1727204049.41601: _execute() done 9396 1727204049.41604: dumping result to json 9396 1727204049.41606: done dumping result, returning 9396 1727204049.41611: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [12b410aa-8751-36c5-1f9e-000000000401] 9396 1727204049.41622: sending task result for task 12b410aa-8751-36c5-1f9e-000000000401 9396 1727204049.41764: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000401 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204049.41957: no more pending results, returning what we have 9396 1727204049.41963: results queue empty 9396 1727204049.41965: checking for any_errors_fatal 9396 1727204049.41973: done checking for any_errors_fatal 9396 1727204049.41974: checking for max_fail_percentage 9396 1727204049.41976: done checking for 
max_fail_percentage 9396 1727204049.41977: checking to see if all hosts have failed and the running result is not ok 9396 1727204049.41978: done checking to see if all hosts have failed 9396 1727204049.41979: getting the remaining hosts for this loop 9396 1727204049.41981: done getting the remaining hosts for this loop 9396 1727204049.41987: getting the next task for host managed-node1 9396 1727204049.41998: done getting next task for host managed-node1 9396 1727204049.42002: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 9396 1727204049.42010: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204049.42016: getting variables 9396 1727204049.42018: in VariableManager get_vars() 9396 1727204049.42066: Calling all_inventory to load vars for managed-node1 9396 1727204049.42069: Calling groups_inventory to load vars for managed-node1 9396 1727204049.42073: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.42264: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.42269: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.42284: WORKER PROCESS EXITING 9396 1727204049.42297: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.44537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.47693: done with get_vars() 9396 1727204049.47733: done getting variables 9396 1727204049.47811: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204049.47970: variable 'profile' from source: include params 9396 1727204049.47975: variable 'item' from source: include params 9396 1727204049.48053: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.083) 0:00:25.451 ***** 9396 1727204049.48091: entering _queue_task() for managed-node1/set_fact 9396 1727204049.48449: worker is 1 (out of 1 available) 9396 1727204049.48463: exiting _queue_task() for managed-node1/set_fact 9396 1727204049.48477: done queuing things up, now waiting for results queue to drain 9396 1727204049.48479: waiting for 
pending results... 9396 1727204049.48781: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 9396 1727204049.48930: in run() - task 12b410aa-8751-36c5-1f9e-000000000402 9396 1727204049.48952: variable 'ansible_search_path' from source: unknown 9396 1727204049.48960: variable 'ansible_search_path' from source: unknown 9396 1727204049.49010: calling self._execute() 9396 1727204049.49136: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.49153: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.49172: variable 'omit' from source: magic vars 9396 1727204049.49614: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.49635: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.49811: variable 'profile_stat' from source: set_fact 9396 1727204049.49833: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204049.49842: when evaluation is False, skipping this task 9396 1727204049.49850: _execute() done 9396 1727204049.49863: dumping result to json 9396 1727204049.49874: done dumping result, returning 9396 1727204049.49885: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [12b410aa-8751-36c5-1f9e-000000000402] 9396 1727204049.49899: sending task result for task 12b410aa-8751-36c5-1f9e-000000000402 9396 1727204049.50043: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000402 9396 1727204049.50047: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204049.50130: no more pending results, returning what we have 9396 1727204049.50136: results queue empty 9396 1727204049.50137: checking for any_errors_fatal 9396 1727204049.50144: done checking for any_errors_fatal 9396 1727204049.50145: 
checking for max_fail_percentage 9396 1727204049.50147: done checking for max_fail_percentage 9396 1727204049.50148: checking to see if all hosts have failed and the running result is not ok 9396 1727204049.50149: done checking to see if all hosts have failed 9396 1727204049.50151: getting the remaining hosts for this loop 9396 1727204049.50152: done getting the remaining hosts for this loop 9396 1727204049.50157: getting the next task for host managed-node1 9396 1727204049.50165: done getting next task for host managed-node1 9396 1727204049.50168: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 9396 1727204049.50173: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204049.50179: getting variables 9396 1727204049.50181: in VariableManager get_vars() 9396 1727204049.50232: Calling all_inventory to load vars for managed-node1 9396 1727204049.50235: Calling groups_inventory to load vars for managed-node1 9396 1727204049.50239: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.50256: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.50260: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.50264: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.52827: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.55828: done with get_vars() 9396 1727204049.55871: done getting variables 9396 1727204049.55949: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204049.56086: variable 'profile' from source: include params 9396 1727204049.56092: variable 'item' from source: include params 9396 1727204049.56166: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.081) 0:00:25.533 ***** 9396 1727204049.56206: entering _queue_task() for managed-node1/command 9396 1727204049.56580: worker is 1 (out of 1 available) 9396 1727204049.56798: exiting _queue_task() for managed-node1/command 9396 1727204049.56812: done queuing things up, now waiting for results queue to drain 9396 1727204049.56814: waiting for pending results... 
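The repeated "skipping" results above come from tasks in get_profile_stat.yml that are guarded by a `when:` condition on `profile_stat.stat.exists` (the stat of the ifcfg file), layered on the distribution check `ansible_distribution_major_version != '6'`. A minimal hypothetical reconstruction of such a guarded task — only the task name, the `command` action plugin, and both conditionals are taken from the log; the grep pattern, file path, and register name are assumptions for illustration:

```yaml
# Hypothetical sketch of the task at get_profile_stat.yml:62.
# Name, module (command), and when: conditions come from the log above;
# the command line and "fingerprint_comment" register are assumed.
- name: Get the fingerprint comment in ifcfg-{{ profile }}
  command: grep "^# System Role:" /etc/sysconfig/network-scripts/ifcfg-{{ profile }}
  register: fingerprint_comment  # assumed name, not present in the log
  when:
    - ansible_distribution_major_version != '6'
    - profile_stat.stat.exists
```

When `profile_stat.stat.exists` evaluates False, the TaskExecutor short-circuits before the module ever runs, which is why the log shows "when evaluation is False, skipping this task" and a result of `"changed": false` with `skip_reason` set.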
9396 1727204049.57012: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 9396 1727204049.57081: in run() - task 12b410aa-8751-36c5-1f9e-000000000403 9396 1727204049.57148: variable 'ansible_search_path' from source: unknown 9396 1727204049.57152: variable 'ansible_search_path' from source: unknown 9396 1727204049.57163: calling self._execute() 9396 1727204049.57278: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.57294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.57314: variable 'omit' from source: magic vars 9396 1727204049.57769: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.57796: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.58023: variable 'profile_stat' from source: set_fact 9396 1727204049.58028: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204049.58030: when evaluation is False, skipping this task 9396 1727204049.58032: _execute() done 9396 1727204049.58035: dumping result to json 9396 1727204049.58037: done dumping result, returning 9396 1727204049.58039: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [12b410aa-8751-36c5-1f9e-000000000403] 9396 1727204049.58041: sending task result for task 12b410aa-8751-36c5-1f9e-000000000403 skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204049.58283: no more pending results, returning what we have 9396 1727204049.58291: results queue empty 9396 1727204049.58292: checking for any_errors_fatal 9396 1727204049.58303: done checking for any_errors_fatal 9396 1727204049.58304: checking for max_fail_percentage 9396 1727204049.58306: done checking for max_fail_percentage 9396 1727204049.58310: checking to see if all hosts have failed and the running result 
is not ok 9396 1727204049.58312: done checking to see if all hosts have failed 9396 1727204049.58312: getting the remaining hosts for this loop 9396 1727204049.58314: done getting the remaining hosts for this loop 9396 1727204049.58320: getting the next task for host managed-node1 9396 1727204049.58328: done getting next task for host managed-node1 9396 1727204049.58332: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 9396 1727204049.58337: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204049.58344: getting variables 9396 1727204049.58347: in VariableManager get_vars() 9396 1727204049.58672: Calling all_inventory to load vars for managed-node1 9396 1727204049.58675: Calling groups_inventory to load vars for managed-node1 9396 1727204049.58679: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.58696: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.58700: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.58705: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.59316: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000403 9396 1727204049.59319: WORKER PROCESS EXITING 9396 1727204049.60595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.62548: done with get_vars() 9396 1727204049.62582: done getting variables 9396 1727204049.62659: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204049.62788: variable 'profile' from source: include params 9396 1727204049.62795: variable 'item' from source: include params 9396 1727204049.62873: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.067) 0:00:25.600 ***** 9396 1727204049.62907: entering _queue_task() for managed-node1/set_fact 9396 1727204049.63175: worker is 1 (out of 1 available) 9396 1727204049.63191: exiting _queue_task() for managed-node1/set_fact 9396 1727204049.63206: done 
queuing things up, now waiting for results queue to drain 9396 1727204049.63208: waiting for pending results... 9396 1727204049.63397: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 9396 1727204049.63491: in run() - task 12b410aa-8751-36c5-1f9e-000000000404 9396 1727204049.63504: variable 'ansible_search_path' from source: unknown 9396 1727204049.63507: variable 'ansible_search_path' from source: unknown 9396 1727204049.63543: calling self._execute() 9396 1727204049.63625: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.63631: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.63642: variable 'omit' from source: magic vars 9396 1727204049.63957: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.63968: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.64076: variable 'profile_stat' from source: set_fact 9396 1727204049.64092: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204049.64096: when evaluation is False, skipping this task 9396 1727204049.64099: _execute() done 9396 1727204049.64103: dumping result to json 9396 1727204049.64107: done dumping result, returning 9396 1727204049.64118: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [12b410aa-8751-36c5-1f9e-000000000404] 9396 1727204049.64124: sending task result for task 12b410aa-8751-36c5-1f9e-000000000404 9396 1727204049.64226: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000404 9396 1727204049.64229: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204049.64279: no more pending results, returning what we have 9396 1727204049.64283: results queue empty 9396 1727204049.64285: checking for any_errors_fatal 
9396 1727204049.64293: done checking for any_errors_fatal 9396 1727204049.64294: checking for max_fail_percentage 9396 1727204049.64295: done checking for max_fail_percentage 9396 1727204049.64297: checking to see if all hosts have failed and the running result is not ok 9396 1727204049.64298: done checking to see if all hosts have failed 9396 1727204049.64299: getting the remaining hosts for this loop 9396 1727204049.64300: done getting the remaining hosts for this loop 9396 1727204049.64305: getting the next task for host managed-node1 9396 1727204049.64313: done getting next task for host managed-node1 9396 1727204049.64316: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 9396 1727204049.64320: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204049.64331: getting variables 9396 1727204049.64332: in VariableManager get_vars() 9396 1727204049.64370: Calling all_inventory to load vars for managed-node1 9396 1727204049.64373: Calling groups_inventory to load vars for managed-node1 9396 1727204049.64376: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.64386: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.64391: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.64395: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.66171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.69270: done with get_vars() 9396 1727204049.69321: done getting variables 9396 1727204049.69401: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204049.69552: variable 'profile' from source: include params 9396 1727204049.69557: variable 'item' from source: include params 9396 1727204049.69631: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.067) 0:00:25.667 ***** 9396 1727204049.69674: entering _queue_task() for managed-node1/assert 9396 1727204049.70066: worker is 1 (out of 1 available) 9396 1727204049.70088: exiting _queue_task() for managed-node1/assert 9396 1727204049.70106: done queuing things up, now waiting for results queue to drain 9396 1727204049.70108: waiting for pending results... 
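Unlike the skipped stat tasks, the assert tasks in assert_profile_present.yml actually execute: the log shows `lsr_net_profile_exists` (a fact set earlier via `set_fact`) evaluating True and "All assertions passed". A hypothetical sketch of the task at assert_profile_present.yml:5 — the task name, the `assert` action, and the `lsr_net_profile_exists` condition appear in the log; the list layout and failure message are assumptions:

```yaml
# Hypothetical sketch of the task at assert_profile_present.yml:5.
# Task name, assert module, and the lsr_net_profile_exists fact come
# from the log; the msg text is assumed for illustration.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    msg: "Profile {{ profile }} was not found"  # assumed message
```

The same shape presumably applies to the subsequent asserts on `lsr_net_profile_ansible_managed` and the fingerprint fact, each of which the log also shows evaluating True.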
9396 1727204049.70427: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' 9396 1727204049.70493: in run() - task 12b410aa-8751-36c5-1f9e-000000000268 9396 1727204049.70523: variable 'ansible_search_path' from source: unknown 9396 1727204049.70527: variable 'ansible_search_path' from source: unknown 9396 1727204049.70632: calling self._execute() 9396 1727204049.70660: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.70670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.70683: variable 'omit' from source: magic vars 9396 1727204049.71098: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.71115: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.71125: variable 'omit' from source: magic vars 9396 1727204049.71178: variable 'omit' from source: magic vars 9396 1727204049.71302: variable 'profile' from source: include params 9396 1727204049.71308: variable 'item' from source: include params 9396 1727204049.71391: variable 'item' from source: include params 9396 1727204049.71410: variable 'omit' from source: magic vars 9396 1727204049.71497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204049.71510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204049.71532: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204049.71556: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204049.71571: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204049.71607: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 
1727204049.71619: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.71622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.71794: Set connection var ansible_timeout to 10 9396 1727204049.71798: Set connection var ansible_shell_executable to /bin/sh 9396 1727204049.71801: Set connection var ansible_pipelining to False 9396 1727204049.71803: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204049.71806: Set connection var ansible_connection to ssh 9396 1727204049.71808: Set connection var ansible_shell_type to sh 9396 1727204049.71813: variable 'ansible_shell_executable' from source: unknown 9396 1727204049.71820: variable 'ansible_connection' from source: unknown 9396 1727204049.71823: variable 'ansible_module_compression' from source: unknown 9396 1727204049.71826: variable 'ansible_shell_type' from source: unknown 9396 1727204049.71838: variable 'ansible_shell_executable' from source: unknown 9396 1727204049.71842: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.71845: variable 'ansible_pipelining' from source: unknown 9396 1727204049.71848: variable 'ansible_timeout' from source: unknown 9396 1727204049.71850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.72195: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204049.72199: variable 'omit' from source: magic vars 9396 1727204049.72202: starting attempt loop 9396 1727204049.72205: running the handler 9396 1727204049.72207: variable 'lsr_net_profile_exists' from source: set_fact 9396 1727204049.72210: Evaluated conditional (lsr_net_profile_exists): True 9396 1727204049.72212: 
handler run complete 9396 1727204049.72215: attempt loop complete, returning result 9396 1727204049.72217: _execute() done 9396 1727204049.72219: dumping result to json 9396 1727204049.72222: done dumping result, returning 9396 1727204049.72224: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.0' [12b410aa-8751-36c5-1f9e-000000000268] 9396 1727204049.72227: sending task result for task 12b410aa-8751-36c5-1f9e-000000000268 9396 1727204049.72321: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000268 9396 1727204049.72324: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204049.72394: no more pending results, returning what we have 9396 1727204049.72399: results queue empty 9396 1727204049.72400: checking for any_errors_fatal 9396 1727204049.72407: done checking for any_errors_fatal 9396 1727204049.72408: checking for max_fail_percentage 9396 1727204049.72410: done checking for max_fail_percentage 9396 1727204049.72412: checking to see if all hosts have failed and the running result is not ok 9396 1727204049.72413: done checking to see if all hosts have failed 9396 1727204049.72414: getting the remaining hosts for this loop 9396 1727204049.72415: done getting the remaining hosts for this loop 9396 1727204049.72420: getting the next task for host managed-node1 9396 1727204049.72425: done getting next task for host managed-node1 9396 1727204049.72428: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 9396 1727204049.72431: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204049.72436: getting variables 9396 1727204049.72437: in VariableManager get_vars() 9396 1727204049.72594: Calling all_inventory to load vars for managed-node1 9396 1727204049.72597: Calling groups_inventory to load vars for managed-node1 9396 1727204049.72601: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.72611: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.72615: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.72619: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.75006: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.78138: done with get_vars() 9396 1727204049.78182: done getting variables 9396 1727204049.78271: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204049.78427: variable 'profile' from source: include params 9396 1727204049.78432: variable 'item' from source: include params 9396 1727204049.78519: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.088) 0:00:25.756 ***** 9396 1727204049.78572: entering _queue_task() for managed-node1/assert 9396 1727204049.78971: worker is 1 (out of 1 available) 9396 1727204049.78988: exiting _queue_task() for 
managed-node1/assert 9396 1727204049.79003: done queuing things up, now waiting for results queue to drain 9396 1727204049.79005: waiting for pending results... 9396 1727204049.79417: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 9396 1727204049.79458: in run() - task 12b410aa-8751-36c5-1f9e-000000000269 9396 1727204049.79482: variable 'ansible_search_path' from source: unknown 9396 1727204049.79532: variable 'ansible_search_path' from source: unknown 9396 1727204049.79553: calling self._execute() 9396 1727204049.79684: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.79701: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.79719: variable 'omit' from source: magic vars 9396 1727204049.80191: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.80293: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.80300: variable 'omit' from source: magic vars 9396 1727204049.80303: variable 'omit' from source: magic vars 9396 1727204049.80420: variable 'profile' from source: include params 9396 1727204049.80432: variable 'item' from source: include params 9396 1727204049.80619: variable 'item' from source: include params 9396 1727204049.80623: variable 'omit' from source: magic vars 9396 1727204049.80626: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204049.80658: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204049.80685: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204049.80716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204049.80749: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204049.80838: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204049.80841: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.80849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.80973: Set connection var ansible_timeout to 10 9396 1727204049.80987: Set connection var ansible_shell_executable to /bin/sh 9396 1727204049.81006: Set connection var ansible_pipelining to False 9396 1727204049.81018: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204049.81030: Set connection var ansible_connection to ssh 9396 1727204049.81037: Set connection var ansible_shell_type to sh 9396 1727204049.81093: variable 'ansible_shell_executable' from source: unknown 9396 1727204049.81097: variable 'ansible_connection' from source: unknown 9396 1727204049.81099: variable 'ansible_module_compression' from source: unknown 9396 1727204049.81164: variable 'ansible_shell_type' from source: unknown 9396 1727204049.81167: variable 'ansible_shell_executable' from source: unknown 9396 1727204049.81172: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.81177: variable 'ansible_pipelining' from source: unknown 9396 1727204049.81180: variable 'ansible_timeout' from source: unknown 9396 1727204049.81182: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.81330: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204049.81350: variable 'omit' from source: magic vars 9396 1727204049.81362: starting attempt loop 9396 
1727204049.81369: running the handler 9396 1727204049.81525: variable 'lsr_net_profile_ansible_managed' from source: set_fact 9396 1727204049.81537: Evaluated conditional (lsr_net_profile_ansible_managed): True 9396 1727204049.81552: handler run complete 9396 1727204049.81599: attempt loop complete, returning result 9396 1727204049.81605: _execute() done 9396 1727204049.81611: dumping result to json 9396 1727204049.81615: done dumping result, returning 9396 1727204049.81696: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [12b410aa-8751-36c5-1f9e-000000000269] 9396 1727204049.81699: sending task result for task 12b410aa-8751-36c5-1f9e-000000000269 9396 1727204049.81770: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000269 9396 1727204049.81796: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204049.81858: no more pending results, returning what we have 9396 1727204049.81863: results queue empty 9396 1727204049.81864: checking for any_errors_fatal 9396 1727204049.81873: done checking for any_errors_fatal 9396 1727204049.81874: checking for max_fail_percentage 9396 1727204049.81876: done checking for max_fail_percentage 9396 1727204049.81878: checking to see if all hosts have failed and the running result is not ok 9396 1727204049.81879: done checking to see if all hosts have failed 9396 1727204049.81880: getting the remaining hosts for this loop 9396 1727204049.81881: done getting the remaining hosts for this loop 9396 1727204049.81887: getting the next task for host managed-node1 9396 1727204049.81896: done getting next task for host managed-node1 9396 1727204049.81899: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 9396 1727204049.81903: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204049.81910: getting variables 9396 1727204049.81912: in VariableManager get_vars() 9396 1727204049.81960: Calling all_inventory to load vars for managed-node1 9396 1727204049.81964: Calling groups_inventory to load vars for managed-node1 9396 1727204049.81967: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.81981: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.81985: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.82111: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.84621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.87747: done with get_vars() 9396 1727204049.87799: done getting variables 9396 1727204049.87878: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204049.88024: variable 'profile' from source: include params 9396 1727204049.88029: variable 'item' from source: include params 9396 1727204049.88110: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 
September 2024 14:54:09 -0400 (0:00:00.095) 0:00:25.852 ***** 9396 1727204049.88150: entering _queue_task() for managed-node1/assert 9396 1727204049.88796: worker is 1 (out of 1 available) 9396 1727204049.88805: exiting _queue_task() for managed-node1/assert 9396 1727204049.88816: done queuing things up, now waiting for results queue to drain 9396 1727204049.88818: waiting for pending results... 9396 1727204049.88915: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 9396 1727204049.89097: in run() - task 12b410aa-8751-36c5-1f9e-00000000026a 9396 1727204049.89103: variable 'ansible_search_path' from source: unknown 9396 1727204049.89109: variable 'ansible_search_path' from source: unknown 9396 1727204049.89112: calling self._execute() 9396 1727204049.89182: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.89192: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.89204: variable 'omit' from source: magic vars 9396 1727204049.89666: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.89680: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.89691: variable 'omit' from source: magic vars 9396 1727204049.89753: variable 'omit' from source: magic vars 9396 1727204049.89890: variable 'profile' from source: include params 9396 1727204049.89895: variable 'item' from source: include params 9396 1727204049.89983: variable 'item' from source: include params 9396 1727204049.90095: variable 'omit' from source: magic vars 9396 1727204049.90100: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204049.90112: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204049.90142: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 
1727204049.90169: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204049.90183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204049.90221: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204049.90225: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.90229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.90378: Set connection var ansible_timeout to 10 9396 1727204049.90385: Set connection var ansible_shell_executable to /bin/sh 9396 1727204049.90398: Set connection var ansible_pipelining to False 9396 1727204049.90406: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204049.90413: Set connection var ansible_connection to ssh 9396 1727204049.90417: Set connection var ansible_shell_type to sh 9396 1727204049.90495: variable 'ansible_shell_executable' from source: unknown 9396 1727204049.90499: variable 'ansible_connection' from source: unknown 9396 1727204049.90502: variable 'ansible_module_compression' from source: unknown 9396 1727204049.90504: variable 'ansible_shell_type' from source: unknown 9396 1727204049.90509: variable 'ansible_shell_executable' from source: unknown 9396 1727204049.90512: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.90514: variable 'ansible_pipelining' from source: unknown 9396 1727204049.90517: variable 'ansible_timeout' from source: unknown 9396 1727204049.90519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.90670: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204049.90699: variable 'omit' from source: magic vars 9396 1727204049.90794: starting attempt loop 9396 1727204049.90798: running the handler 9396 1727204049.90855: variable 'lsr_net_profile_fingerprint' from source: set_fact 9396 1727204049.90861: Evaluated conditional (lsr_net_profile_fingerprint): True 9396 1727204049.90870: handler run complete 9396 1727204049.90895: attempt loop complete, returning result 9396 1727204049.90912: _execute() done 9396 1727204049.90916: dumping result to json 9396 1727204049.90919: done dumping result, returning 9396 1727204049.90927: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.0 [12b410aa-8751-36c5-1f9e-00000000026a] 9396 1727204049.90933: sending task result for task 12b410aa-8751-36c5-1f9e-00000000026a ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204049.91094: no more pending results, returning what we have 9396 1727204049.91098: results queue empty 9396 1727204049.91100: checking for any_errors_fatal 9396 1727204049.91113: done checking for any_errors_fatal 9396 1727204049.91114: checking for max_fail_percentage 9396 1727204049.91115: done checking for max_fail_percentage 9396 1727204049.91117: checking to see if all hosts have failed and the running result is not ok 9396 1727204049.91118: done checking to see if all hosts have failed 9396 1727204049.91119: getting the remaining hosts for this loop 9396 1727204049.91125: done getting the remaining hosts for this loop 9396 1727204049.91131: getting the next task for host managed-node1 9396 1727204049.91142: done getting next task for host managed-node1 9396 1727204049.91146: ^ task is: TASK: Include the task 'get_profile_stat.yml' 9396 1727204049.91150: ^ state is: HOST STATE: block=2, task=13, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204049.91156: getting variables 9396 1727204049.91158: in VariableManager get_vars() 9396 1727204049.91212: Calling all_inventory to load vars for managed-node1 9396 1727204049.91217: Calling groups_inventory to load vars for managed-node1 9396 1727204049.91220: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.91312: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.91317: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204049.91324: Calling groups_plugins_play to load vars for managed-node1 9396 1727204049.91858: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000026a 9396 1727204049.91862: WORKER PROCESS EXITING 9396 1727204049.93767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204049.96705: done with get_vars() 9396 1727204049.96765: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Tuesday 24 September 2024 14:54:09 -0400 (0:00:00.087) 0:00:25.939 ***** 9396 1727204049.96927: entering _queue_task() for managed-node1/include_tasks 9396 1727204049.97351: worker is 1 (out of 1 available) 9396 1727204049.97366: exiting _queue_task() for managed-node1/include_tasks 9396 
1727204049.97381: done queuing things up, now waiting for results queue to drain 9396 1727204049.97383: waiting for pending results... 9396 1727204049.97713: running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' 9396 1727204049.97765: in run() - task 12b410aa-8751-36c5-1f9e-00000000026e 9396 1727204049.97799: variable 'ansible_search_path' from source: unknown 9396 1727204049.97803: variable 'ansible_search_path' from source: unknown 9396 1727204049.97848: calling self._execute() 9396 1727204049.97957: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204049.97962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204049.97975: variable 'omit' from source: magic vars 9396 1727204049.98311: variable 'ansible_distribution_major_version' from source: facts 9396 1727204049.98319: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204049.98329: _execute() done 9396 1727204049.98332: dumping result to json 9396 1727204049.98335: done dumping result, returning 9396 1727204049.98346: done running TaskExecutor() for managed-node1/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-36c5-1f9e-00000000026e] 9396 1727204049.98352: sending task result for task 12b410aa-8751-36c5-1f9e-00000000026e 9396 1727204049.98457: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000026e 9396 1727204049.98460: WORKER PROCESS EXITING 9396 1727204049.98494: no more pending results, returning what we have 9396 1727204049.98500: in VariableManager get_vars() 9396 1727204049.98552: Calling all_inventory to load vars for managed-node1 9396 1727204049.98556: Calling groups_inventory to load vars for managed-node1 9396 1727204049.98559: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204049.98576: Calling all_plugins_play to load vars for managed-node1 9396 1727204049.98580: Calling groups_plugins_inventory to load vars for 
managed-node1 9396 1727204049.98584: Calling groups_plugins_play to load vars for managed-node1 9396 1727204050.03549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204050.07075: done with get_vars() 9396 1727204050.07112: variable 'ansible_search_path' from source: unknown 9396 1727204050.07113: variable 'ansible_search_path' from source: unknown 9396 1727204050.07145: we have included files to process 9396 1727204050.07146: generating all_blocks data 9396 1727204050.07147: done generating all_blocks data 9396 1727204050.07149: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204050.07150: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204050.07151: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 9396 1727204050.07916: done processing included file 9396 1727204050.07918: iterating over new_blocks loaded from include file 9396 1727204050.07919: in VariableManager get_vars() 9396 1727204050.07937: done with get_vars() 9396 1727204050.07940: filtering new block on tags 9396 1727204050.07961: done filtering new block on tags 9396 1727204050.07964: in VariableManager get_vars() 9396 1727204050.07980: done with get_vars() 9396 1727204050.07982: filtering new block on tags 9396 1727204050.08001: done filtering new block on tags 9396 1727204050.08002: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node1 9396 1727204050.08006: extending task lists for all hosts with included blocks 9396 1727204050.08146: done extending task lists 9396 1727204050.08147: done processing 
included files 9396 1727204050.08148: results queue empty 9396 1727204050.08148: checking for any_errors_fatal 9396 1727204050.08151: done checking for any_errors_fatal 9396 1727204050.08152: checking for max_fail_percentage 9396 1727204050.08153: done checking for max_fail_percentage 9396 1727204050.08154: checking to see if all hosts have failed and the running result is not ok 9396 1727204050.08154: done checking to see if all hosts have failed 9396 1727204050.08155: getting the remaining hosts for this loop 9396 1727204050.08156: done getting the remaining hosts for this loop 9396 1727204050.08158: getting the next task for host managed-node1 9396 1727204050.08160: done getting next task for host managed-node1 9396 1727204050.08162: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 9396 1727204050.08164: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204050.08166: getting variables 9396 1727204050.08167: in VariableManager get_vars() 9396 1727204050.08179: Calling all_inventory to load vars for managed-node1 9396 1727204050.08181: Calling groups_inventory to load vars for managed-node1 9396 1727204050.08182: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204050.08188: Calling all_plugins_play to load vars for managed-node1 9396 1727204050.08192: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204050.08195: Calling groups_plugins_play to load vars for managed-node1 9396 1727204050.10488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204050.13413: done with get_vars() 9396 1727204050.13456: done getting variables 9396 1727204050.13521: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.166) 0:00:26.106 ***** 9396 1727204050.13559: entering _queue_task() for managed-node1/set_fact 9396 1727204050.13944: worker is 1 (out of 1 available) 9396 1727204050.13957: exiting _queue_task() for managed-node1/set_fact 9396 1727204050.13971: done queuing things up, now waiting for results queue to drain 9396 1727204050.13972: waiting for pending results... 
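The set_fact task queued at this point (get_profile_stat.yml:3) initializes three tracking flags. Judging from the ansible_facts it returns later in this log, it is roughly equivalent to the following sketch (reconstructed from the logged task result, not copied from the playbook source):

```yaml
# Sketch of "Initialize NM profile exist and ansible_managed comment flag"
# (get_profile_stat.yml:3). Reconstructed from the ansible_facts shown in
# the task result below -- the actual task body is not visible in this log.
- name: Initialize NM profile exist and ansible_managed comment flag
  ansible.builtin.set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

The later assert tasks in assert_profile_present.yml then check that these flags have been flipped to true.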
9396 1727204050.14320: running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and ansible_managed comment flag 9396 1727204050.14393: in run() - task 12b410aa-8751-36c5-1f9e-000000000443 9396 1727204050.14595: variable 'ansible_search_path' from source: unknown 9396 1727204050.14599: variable 'ansible_search_path' from source: unknown 9396 1727204050.14604: calling self._execute() 9396 1727204050.14609: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.14613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.14616: variable 'omit' from source: magic vars 9396 1727204050.15023: variable 'ansible_distribution_major_version' from source: facts 9396 1727204050.15042: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204050.15049: variable 'omit' from source: magic vars 9396 1727204050.15113: variable 'omit' from source: magic vars 9396 1727204050.15168: variable 'omit' from source: magic vars 9396 1727204050.15211: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204050.15251: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204050.15279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204050.15304: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204050.15318: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204050.15354: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204050.15358: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.15368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 
1727204050.15594: Set connection var ansible_timeout to 10 9396 1727204050.15599: Set connection var ansible_shell_executable to /bin/sh 9396 1727204050.15602: Set connection var ansible_pipelining to False 9396 1727204050.15605: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204050.15610: Set connection var ansible_connection to ssh 9396 1727204050.15613: Set connection var ansible_shell_type to sh 9396 1727204050.15616: variable 'ansible_shell_executable' from source: unknown 9396 1727204050.15619: variable 'ansible_connection' from source: unknown 9396 1727204050.15622: variable 'ansible_module_compression' from source: unknown 9396 1727204050.15624: variable 'ansible_shell_type' from source: unknown 9396 1727204050.15627: variable 'ansible_shell_executable' from source: unknown 9396 1727204050.15629: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.15631: variable 'ansible_pipelining' from source: unknown 9396 1727204050.15633: variable 'ansible_timeout' from source: unknown 9396 1727204050.15635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.15896: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204050.15901: variable 'omit' from source: magic vars 9396 1727204050.15904: starting attempt loop 9396 1727204050.15909: running the handler 9396 1727204050.15912: handler run complete 9396 1727204050.15970: attempt loop complete, returning result 9396 1727204050.15974: _execute() done 9396 1727204050.15977: dumping result to json 9396 1727204050.15980: done dumping result, returning 9396 1727204050.15982: done running TaskExecutor() for managed-node1/TASK: Initialize NM profile exist and 
ansible_managed comment flag [12b410aa-8751-36c5-1f9e-000000000443] 9396 1727204050.15984: sending task result for task 12b410aa-8751-36c5-1f9e-000000000443 ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 9396 1727204050.16278: no more pending results, returning what we have 9396 1727204050.16283: results queue empty 9396 1727204050.16285: checking for any_errors_fatal 9396 1727204050.16287: done checking for any_errors_fatal 9396 1727204050.16291: checking for max_fail_percentage 9396 1727204050.16293: done checking for max_fail_percentage 9396 1727204050.16294: checking to see if all hosts have failed and the running result is not ok 9396 1727204050.16296: done checking to see if all hosts have failed 9396 1727204050.16297: getting the remaining hosts for this loop 9396 1727204050.16298: done getting the remaining hosts for this loop 9396 1727204050.16310: getting the next task for host managed-node1 9396 1727204050.16319: done getting next task for host managed-node1 9396 1727204050.16323: ^ task is: TASK: Stat profile file 9396 1727204050.16328: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 9396 1727204050.16333: getting variables 9396 1727204050.16336: in VariableManager get_vars() 9396 1727204050.16388: Calling all_inventory to load vars for managed-node1 9396 1727204050.16498: Calling groups_inventory to load vars for managed-node1 9396 1727204050.16502: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204050.16513: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000443 9396 1727204050.16516: WORKER PROCESS EXITING 9396 1727204050.16533: Calling all_plugins_play to load vars for managed-node1 9396 1727204050.16538: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204050.16542: Calling groups_plugins_play to load vars for managed-node1 9396 1727204050.18849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204050.20568: done with get_vars() 9396 1727204050.20599: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.071) 0:00:26.177 ***** 9396 1727204050.20684: entering _queue_task() for managed-node1/stat 9396 1727204050.20994: worker is 1 (out of 1 available) 9396 1727204050.21012: exiting _queue_task() for managed-node1/stat 9396 1727204050.21026: done queuing things up, now waiting for results queue to drain 9396 1727204050.21028: waiting for pending results... 
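The "Stat profile file" task that runs next (get_profile_stat.yml:9) checks for the profile's configuration file on the managed node via the stat module (the log below shows the module being transferred as AnsiballZ_stat.py). A minimal sketch of such a task follows; the actual file path and register name are not visible in this portion of the log, so both are placeholders:

```yaml
# Hedged sketch only: the real path checked by get_profile_stat.yml:9 is not
# shown in this log excerpt; "<profile-file-path>" and the register name
# "profile_stat" are placeholders, not taken from the playbook source.
- name: Stat profile file
  ansible.builtin.stat:
    path: "<profile-file-path>"  # presumably derived from the 'profile' variable
  register: profile_stat
```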
9396 1727204050.21287: running TaskExecutor() for managed-node1/TASK: Stat profile file 9396 1727204050.21450: in run() - task 12b410aa-8751-36c5-1f9e-000000000444 9396 1727204050.21477: variable 'ansible_search_path' from source: unknown 9396 1727204050.21485: variable 'ansible_search_path' from source: unknown 9396 1727204050.21551: calling self._execute() 9396 1727204050.21680: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.21700: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.21723: variable 'omit' from source: magic vars 9396 1727204050.22238: variable 'ansible_distribution_major_version' from source: facts 9396 1727204050.22263: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204050.22267: variable 'omit' from source: magic vars 9396 1727204050.22329: variable 'omit' from source: magic vars 9396 1727204050.22425: variable 'profile' from source: include params 9396 1727204050.22429: variable 'item' from source: include params 9396 1727204050.22485: variable 'item' from source: include params 9396 1727204050.22509: variable 'omit' from source: magic vars 9396 1727204050.22548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204050.22579: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204050.22601: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204050.22627: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204050.22638: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204050.22666: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204050.22670: variable 
'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.22674: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.22767: Set connection var ansible_timeout to 10 9396 1727204050.22774: Set connection var ansible_shell_executable to /bin/sh 9396 1727204050.22782: Set connection var ansible_pipelining to False 9396 1727204050.22790: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204050.22797: Set connection var ansible_connection to ssh 9396 1727204050.22800: Set connection var ansible_shell_type to sh 9396 1727204050.22827: variable 'ansible_shell_executable' from source: unknown 9396 1727204050.22832: variable 'ansible_connection' from source: unknown 9396 1727204050.22835: variable 'ansible_module_compression' from source: unknown 9396 1727204050.22837: variable 'ansible_shell_type' from source: unknown 9396 1727204050.22843: variable 'ansible_shell_executable' from source: unknown 9396 1727204050.22845: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.22850: variable 'ansible_pipelining' from source: unknown 9396 1727204050.22854: variable 'ansible_timeout' from source: unknown 9396 1727204050.22859: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.23039: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204050.23049: variable 'omit' from source: magic vars 9396 1727204050.23057: starting attempt loop 9396 1727204050.23060: running the handler 9396 1727204050.23075: _low_level_execute_command(): starting 9396 1727204050.23082: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204050.23642: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.23648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.23652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.23696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.23717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204050.23720: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.23765: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.25531: stdout chunk (state=3): >>>/root <<< 9396 1727204050.25688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204050.25697: stdout chunk (state=3): >>><<< 9396 1727204050.25707: stderr chunk (state=3): >>><<< 9396 1727204050.25736: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204050.25750: _low_level_execute_command(): starting 9396 1727204050.25757: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657 `" && echo ansible-tmp-1727204050.257365-11396-168156349423657="` echo /root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657 `" ) && sleep 0' 9396 1727204050.26201: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.26231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.26241: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204050.26244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.26297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.26301: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.26351: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.28423: stdout chunk (state=3): >>>ansible-tmp-1727204050.257365-11396-168156349423657=/root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657 <<< 9396 1727204050.28634: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204050.28638: stdout chunk (state=3): >>><<< 9396 1727204050.28640: stderr chunk (state=3): >>><<< 9396 1727204050.28750: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204050.257365-11396-168156349423657=/root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204050.28753: variable 'ansible_module_compression' from source: unknown 9396 1727204050.28792: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 9396 1727204050.28827: variable 'ansible_facts' from source: unknown 9396 1727204050.28884: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/AnsiballZ_stat.py 9396 1727204050.28992: Sending initial data 9396 1727204050.28997: Sent initial data (151 bytes) 9396 1727204050.29449: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.29453: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.29456: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204050.29460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.29513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.29516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.29562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.31239: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204050.31296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204050.31305: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp0mon1weh /root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/AnsiballZ_stat.py <<< 9396 1727204050.31315: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/AnsiballZ_stat.py" <<< 9396 1727204050.31370: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp0mon1weh" to remote "/root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/AnsiballZ_stat.py" <<< 9396 1727204050.33296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204050.33300: stderr chunk (state=3): >>><<< 9396 1727204050.33303: stdout chunk (state=3): >>><<< 9396 1727204050.33306: done transferring module to remote 9396 1727204050.33308: _low_level_execute_command(): starting 9396 1727204050.33310: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/ /root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/AnsiballZ_stat.py && sleep 0' 9396 1727204050.33939: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204050.34009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.34073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.34086: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204050.34106: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.34205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.36093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204050.36149: stderr chunk (state=3): >>><<< 9396 1727204050.36153: stdout chunk (state=3): >>><<< 9396 1727204050.36169: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204050.36172: _low_level_execute_command(): starting 9396 1727204050.36179: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/AnsiballZ_stat.py && sleep 0' 9396 1727204050.36644: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.36650: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204050.36653: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.36655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.36658: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.36707: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.36711: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.36762: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.54304: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 9396 1727204050.55997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204050.56002: stdout chunk (state=3): >>><<< 9396 1727204050.56004: stderr chunk (state=3): >>><<< 9396 1727204050.56009: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204050.56012: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204050.56014: _low_level_execute_command(): starting 9396 1727204050.56016: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204050.257365-11396-168156349423657/ > /dev/null 2>&1 && sleep 0' 9396 1727204050.56573: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204050.56597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204050.56603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.56710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.56720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.56737: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204050.56763: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.56934: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.58840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204050.58844: stdout chunk (state=3): >>><<< 9396 1727204050.58852: stderr chunk (state=3): >>><<< 9396 1727204050.58994: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204050.58998: handler run complete 9396 1727204050.59000: attempt loop complete, returning result 9396 1727204050.59003: _execute() done 9396 1727204050.59005: dumping result to json 9396 1727204050.59010: done dumping result, returning 9396 1727204050.59013: done running TaskExecutor() for managed-node1/TASK: Stat profile file [12b410aa-8751-36c5-1f9e-000000000444] 9396 1727204050.59015: sending task result for task 12b410aa-8751-36c5-1f9e-000000000444 9396 1727204050.59092: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000444 ok: [managed-node1] => { "changed": false, "stat": { "exists": false } } 9396 1727204050.59170: no more pending results, returning what we have 9396 1727204050.59175: results queue empty 9396 1727204050.59177: checking for any_errors_fatal 9396 1727204050.59187: done checking for any_errors_fatal 9396 1727204050.59188: checking for max_fail_percentage 9396 1727204050.59190: done checking for max_fail_percentage 9396 1727204050.59443: checking to see if all hosts have failed and the running result is not ok 9396 1727204050.59446: done checking to see if all hosts have failed 9396 1727204050.59447: getting the remaining hosts for this loop 9396 1727204050.59448: done getting the remaining hosts for this loop 9396 1727204050.59453: getting the next task for host managed-node1 9396 1727204050.59461: done getting next task for host managed-node1 9396 1727204050.59464: ^ task is: TASK: Set NM profile exist flag based on the profile files 9396 1727204050.59468: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204050.59472: getting variables 9396 1727204050.59474: in VariableManager get_vars() 9396 1727204050.59520: Calling all_inventory to load vars for managed-node1 9396 1727204050.59523: Calling groups_inventory to load vars for managed-node1 9396 1727204050.59526: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204050.59540: Calling all_plugins_play to load vars for managed-node1 9396 1727204050.59544: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204050.59548: Calling groups_plugins_play to load vars for managed-node1 9396 1727204050.60625: WORKER PROCESS EXITING 9396 1727204050.63100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204050.66086: done with get_vars() 9396 1727204050.66129: done getting variables 9396 1727204050.66195: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.455) 0:00:26.633 ***** 9396 1727204050.66231: entering _queue_task() for managed-node1/set_fact 9396 1727204050.66828: worker is 1 (out of 1 available) 9396 1727204050.66843: exiting _queue_task() for managed-node1/set_fact 9396 1727204050.66857: done queuing things up, now waiting for results queue to drain 9396 1727204050.66859: waiting for pending results... 9396 1727204050.67423: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files 9396 1727204050.67682: in run() - task 12b410aa-8751-36c5-1f9e-000000000445 9396 1727204050.67700: variable 'ansible_search_path' from source: unknown 9396 1727204050.67704: variable 'ansible_search_path' from source: unknown 9396 1727204050.67746: calling self._execute() 9396 1727204050.67966: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.68204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.68218: variable 'omit' from source: magic vars 9396 1727204050.69088: variable 'ansible_distribution_major_version' from source: facts 9396 1727204050.69106: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204050.69377: variable 'profile_stat' from source: set_fact 9396 1727204050.69499: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204050.69503: when evaluation is False, skipping this task 9396 1727204050.69505: _execute() done 9396 1727204050.69511: dumping result to json 9396 1727204050.69521: done dumping result, returning 9396 1727204050.69528: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-36c5-1f9e-000000000445] 9396 1727204050.69537: sending task result for task 12b410aa-8751-36c5-1f9e-000000000445 9396 
1727204050.69767: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000445 9396 1727204050.69770: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204050.69832: no more pending results, returning what we have 9396 1727204050.69837: results queue empty 9396 1727204050.69839: checking for any_errors_fatal 9396 1727204050.69849: done checking for any_errors_fatal 9396 1727204050.69850: checking for max_fail_percentage 9396 1727204050.69852: done checking for max_fail_percentage 9396 1727204050.69853: checking to see if all hosts have failed and the running result is not ok 9396 1727204050.69854: done checking to see if all hosts have failed 9396 1727204050.69855: getting the remaining hosts for this loop 9396 1727204050.69857: done getting the remaining hosts for this loop 9396 1727204050.69861: getting the next task for host managed-node1 9396 1727204050.69870: done getting next task for host managed-node1 9396 1727204050.69874: ^ task is: TASK: Get NM profile info 9396 1727204050.69878: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204050.69883: getting variables 9396 1727204050.69885: in VariableManager get_vars() 9396 1727204050.69931: Calling all_inventory to load vars for managed-node1 9396 1727204050.69934: Calling groups_inventory to load vars for managed-node1 9396 1727204050.69937: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204050.69950: Calling all_plugins_play to load vars for managed-node1 9396 1727204050.69954: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204050.69958: Calling groups_plugins_play to load vars for managed-node1 9396 1727204050.72662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204050.76197: done with get_vars() 9396 1727204050.76237: done getting variables 9396 1727204050.76312: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:54:10 -0400 (0:00:00.101) 0:00:26.734 ***** 9396 1727204050.76350: entering _queue_task() for managed-node1/shell 9396 1727204050.76687: worker is 1 (out of 1 available) 9396 1727204050.76702: exiting _queue_task() for managed-node1/shell 9396 1727204050.76716: done queuing things up, now waiting for results queue to drain 9396 1727204050.76718: waiting for pending results... 
9396 1727204050.77113: running TaskExecutor() for managed-node1/TASK: Get NM profile info 9396 1727204050.77172: in run() - task 12b410aa-8751-36c5-1f9e-000000000446 9396 1727204050.77199: variable 'ansible_search_path' from source: unknown 9396 1727204050.77214: variable 'ansible_search_path' from source: unknown 9396 1727204050.77255: calling self._execute() 9396 1727204050.77374: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.77387: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.77424: variable 'omit' from source: magic vars 9396 1727204050.77877: variable 'ansible_distribution_major_version' from source: facts 9396 1727204050.77965: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204050.77969: variable 'omit' from source: magic vars 9396 1727204050.77975: variable 'omit' from source: magic vars 9396 1727204050.78114: variable 'profile' from source: include params 9396 1727204050.78126: variable 'item' from source: include params 9396 1727204050.78214: variable 'item' from source: include params 9396 1727204050.78242: variable 'omit' from source: magic vars 9396 1727204050.78294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204050.78344: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204050.78371: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204050.78401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204050.78424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204050.78462: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204050.78494: variable 
'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.78497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.78617: Set connection var ansible_timeout to 10 9396 1727204050.78694: Set connection var ansible_shell_executable to /bin/sh 9396 1727204050.78698: Set connection var ansible_pipelining to False 9396 1727204050.78700: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204050.78703: Set connection var ansible_connection to ssh 9396 1727204050.78705: Set connection var ansible_shell_type to sh 9396 1727204050.78710: variable 'ansible_shell_executable' from source: unknown 9396 1727204050.78715: variable 'ansible_connection' from source: unknown 9396 1727204050.78728: variable 'ansible_module_compression' from source: unknown 9396 1727204050.78735: variable 'ansible_shell_type' from source: unknown 9396 1727204050.78743: variable 'ansible_shell_executable' from source: unknown 9396 1727204050.78751: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204050.78759: variable 'ansible_pipelining' from source: unknown 9396 1727204050.78767: variable 'ansible_timeout' from source: unknown 9396 1727204050.78775: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204050.78958: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204050.78976: variable 'omit' from source: magic vars 9396 1727204050.78987: starting attempt loop 9396 1727204050.78997: running the handler 9396 1727204050.79051: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204050.79055: _low_level_execute_command(): starting 9396 1727204050.79057: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204050.79928: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.79959: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.79976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204050.80047: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.80127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.82258: stdout chunk (state=3): >>>/root <<< 9396 1727204050.82262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204050.82264: stdout chunk (state=3): >>><<< 9396 1727204050.82267: stderr chunk (state=3): >>><<< 9396 1727204050.82596: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204050.82601: _low_level_execute_command(): starting 9396 1727204050.82604: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143 `" && echo ansible-tmp-1727204050.8221478-11425-43986557889143="` echo /root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143 `" ) && sleep 0' 9396 1727204050.82828: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204050.82837: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204050.82847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.82865: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204050.82878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204050.82899: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204050.82902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.82917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204050.82932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204050.83054: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204050.83062: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204050.83065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.83068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204050.83070: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204050.83072: stderr chunk (state=3): >>>debug2: match found <<< 9396 1727204050.83075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.83077: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.83105: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204050.83115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.83214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.85234: stdout chunk (state=3): >>>ansible-tmp-1727204050.8221478-11425-43986557889143=/root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143 <<< 9396 
1727204050.85440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204050.85443: stdout chunk (state=3): >>><<< 9396 1727204050.85446: stderr chunk (state=3): >>><<< 9396 1727204050.85700: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204050.8221478-11425-43986557889143=/root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204050.85704: variable 'ansible_module_compression' from source: unknown 9396 1727204050.85709: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204050.85711: variable 'ansible_facts' from source: unknown 9396 1727204050.85905: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/AnsiballZ_command.py 9396 
1727204050.86323: Sending initial data 9396 1727204050.86327: Sent initial data (154 bytes) 9396 1727204050.87195: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.87225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204050.87244: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204050.87248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.87413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.89057: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 9396 1727204050.89095: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 9396 1727204050.89099: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 9396 1727204050.89102: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 9396 1727204050.89104: stderr chunk (state=3): >>>debug2: Server supports extension 
"hardlink@openssh.com" revision 1 <<< 9396 1727204050.89109: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 9396 1727204050.89112: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 9396 1727204050.89115: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 9396 1727204050.89124: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 9396 1727204050.89146: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204050.89201: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204050.89282: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpfrztnpm8 /root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/AnsiballZ_command.py <<< 9396 1727204050.89287: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/AnsiballZ_command.py" <<< 9396 1727204050.89355: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpfrztnpm8" to remote "/root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/AnsiballZ_command.py" <<< 9396 1727204050.91023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204050.91027: stderr chunk (state=3): >>><<< 9396 1727204050.91030: stdout chunk (state=3): >>><<< 9396 1727204050.91036: done transferring module to remote 9396 1727204050.91038: _low_level_execute_command(): starting 
9396 1727204050.91041: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/ /root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/AnsiballZ_command.py && sleep 0' 9396 1727204050.91864: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204050.91874: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204050.91886: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.91906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204050.91922: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204050.91931: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204050.91947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.91962: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204050.92061: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204050.92077: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.92149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204050.94208: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 9396 1727204050.94225: stderr chunk (state=3): >>><<< 9396 1727204050.94238: stdout chunk (state=3): >>><<< 9396 1727204050.94263: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204050.94360: _low_level_execute_command(): starting 9396 1727204050.94364: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/AnsiballZ_command.py && sleep 0' 9396 1727204050.94898: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204050.94913: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204050.94935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204050.94952: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204050.95013: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204050.95075: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204050.95100: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204050.95229: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204051.15414: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:11.129527", "end": "2024-09-24 14:54:11.153282", "delta": "0:00:00.023755", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204051.17565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204051.17569: stdout chunk (state=3): >>><<< 9396 1727204051.17572: stderr chunk (state=3): >>><<< 9396 1727204051.17580: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-24 14:54:11.129527", "end": "2024-09-24 14:54:11.153282", "delta": "0:00:00.023755", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 
closed. 9396 1727204051.17592: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204051.17594: _low_level_execute_command(): starting 9396 1727204051.17597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204050.8221478-11425-43986557889143/ > /dev/null 2>&1 && sleep 0' 9396 1727204051.18758: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204051.18875: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204051.18891: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 9396 1727204051.19011: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204051.19064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204051.19110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204051.19286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204051.21310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204051.21338: stderr chunk (state=3): >>><<< 9396 1727204051.21358: stdout chunk (state=3): >>><<< 9396 1727204051.21383: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit 
status from master 0 9396 1727204051.21402: handler run complete 9396 1727204051.21443: Evaluated conditional (False): False 9396 1727204051.21472: attempt loop complete, returning result 9396 1727204051.21481: _execute() done 9396 1727204051.21490: dumping result to json 9396 1727204051.21502: done dumping result, returning 9396 1727204051.21524: done running TaskExecutor() for managed-node1/TASK: Get NM profile info [12b410aa-8751-36c5-1f9e-000000000446] 9396 1727204051.21536: sending task result for task 12b410aa-8751-36c5-1f9e-000000000446 9396 1727204051.21897: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000446 9396 1727204051.21901: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": "0:00:00.023755", "end": "2024-09-24 14:54:11.153282", "rc": 0, "start": "2024-09-24 14:54:11.129527" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 9396 1727204051.22004: no more pending results, returning what we have 9396 1727204051.22015: results queue empty 9396 1727204051.22017: checking for any_errors_fatal 9396 1727204051.22023: done checking for any_errors_fatal 9396 1727204051.22024: checking for max_fail_percentage 9396 1727204051.22026: done checking for max_fail_percentage 9396 1727204051.22027: checking to see if all hosts have failed and the running result is not ok 9396 1727204051.22028: done checking to see if all hosts have failed 9396 1727204051.22029: getting the remaining hosts for this loop 9396 1727204051.22031: done getting the remaining hosts for this loop 9396 1727204051.22036: getting the next task for host managed-node1 9396 1727204051.22043: done getting next task for host managed-node1 9396 1727204051.22047: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 9396 1727204051.22051: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, 
run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204051.22055: getting variables 9396 1727204051.22057: in VariableManager get_vars() 9396 1727204051.22117: Calling all_inventory to load vars for managed-node1 9396 1727204051.22195: Calling groups_inventory to load vars for managed-node1 9396 1727204051.22204: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.22222: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.22226: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.22231: Calling groups_plugins_play to load vars for managed-node1 9396 1727204051.25757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204051.29035: done with get_vars() 9396 1727204051.29078: done getting variables 9396 1727204051.29458: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli 
output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.531) 0:00:27.265 ***** 9396 1727204051.29504: entering _queue_task() for managed-node1/set_fact 9396 1727204051.30268: worker is 1 (out of 1 available) 9396 1727204051.30283: exiting _queue_task() for managed-node1/set_fact 9396 1727204051.30305: done queuing things up, now waiting for results queue to drain 9396 1727204051.30309: waiting for pending results... 9396 1727204051.30711: running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 9396 1727204051.30788: in run() - task 12b410aa-8751-36c5-1f9e-000000000447 9396 1727204051.30820: variable 'ansible_search_path' from source: unknown 9396 1727204051.30846: variable 'ansible_search_path' from source: unknown 9396 1727204051.30955: calling self._execute() 9396 1727204051.31021: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.31035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.31063: variable 'omit' from source: magic vars 9396 1727204051.31550: variable 'ansible_distribution_major_version' from source: facts 9396 1727204051.31570: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204051.31766: variable 'nm_profile_exists' from source: set_fact 9396 1727204051.31791: Evaluated conditional (nm_profile_exists.rc == 0): True 9396 1727204051.31810: variable 'omit' from source: magic vars 9396 1727204051.31884: variable 'omit' from source: magic vars 9396 1727204051.31994: variable 'omit' from source: magic vars 9396 1727204051.31998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204051.32053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py 
(found_in_cache=True, class_only=False) 9396 1727204051.32083: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204051.32116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204051.32172: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204051.32228: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204051.32237: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.32246: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.32404: Set connection var ansible_timeout to 10 9396 1727204051.32495: Set connection var ansible_shell_executable to /bin/sh 9396 1727204051.32511: Set connection var ansible_pipelining to False 9396 1727204051.32514: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204051.32516: Set connection var ansible_connection to ssh 9396 1727204051.32519: Set connection var ansible_shell_type to sh 9396 1727204051.32522: variable 'ansible_shell_executable' from source: unknown 9396 1727204051.32524: variable 'ansible_connection' from source: unknown 9396 1727204051.32526: variable 'ansible_module_compression' from source: unknown 9396 1727204051.32528: variable 'ansible_shell_type' from source: unknown 9396 1727204051.32618: variable 'ansible_shell_executable' from source: unknown 9396 1727204051.32622: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.32624: variable 'ansible_pipelining' from source: unknown 9396 1727204051.32627: variable 'ansible_timeout' from source: unknown 9396 1727204051.32630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.32759: Loading ActionModule 'set_fact' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204051.32777: variable 'omit' from source: magic vars 9396 1727204051.32791: starting attempt loop 9396 1727204051.32800: running the handler 9396 1727204051.32825: handler run complete 9396 1727204051.32849: attempt loop complete, returning result 9396 1727204051.32862: _execute() done 9396 1727204051.32873: dumping result to json 9396 1727204051.32882: done dumping result, returning 9396 1727204051.32899: done running TaskExecutor() for managed-node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-36c5-1f9e-000000000447] 9396 1727204051.32945: sending task result for task 12b410aa-8751-36c5-1f9e-000000000447 ok: [managed-node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 9396 1727204051.33234: no more pending results, returning what we have 9396 1727204051.33239: results queue empty 9396 1727204051.33240: checking for any_errors_fatal 9396 1727204051.33249: done checking for any_errors_fatal 9396 1727204051.33251: checking for max_fail_percentage 9396 1727204051.33252: done checking for max_fail_percentage 9396 1727204051.33254: checking to see if all hosts have failed and the running result is not ok 9396 1727204051.33255: done checking to see if all hosts have failed 9396 1727204051.33256: getting the remaining hosts for this loop 9396 1727204051.33258: done getting the remaining hosts for this loop 9396 1727204051.33263: getting the next task for host managed-node1 9396 1727204051.33277: done getting next task for host managed-node1 9396 1727204051.33281: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile 
}} 9396 1727204051.33286: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204051.33494: getting variables 9396 1727204051.33497: in VariableManager get_vars() 9396 1727204051.33543: Calling all_inventory to load vars for managed-node1 9396 1727204051.33547: Calling groups_inventory to load vars for managed-node1 9396 1727204051.33550: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.33566: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.33570: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.33574: Calling groups_plugins_play to load vars for managed-node1 9396 1727204051.34378: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000447 9396 1727204051.34383: WORKER PROCESS EXITING 9396 1727204051.36350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204051.39822: done with get_vars() 9396 1727204051.39872: done getting variables 9396 1727204051.39955: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204051.40116: variable 'profile' from source: include params 9396 1727204051.40121: variable 'item' from source: include params 9396 1727204051.40197: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.107) 0:00:27.373 ***** 9396 1727204051.40244: entering _queue_task() for managed-node1/command 9396 1727204051.40814: worker is 1 (out of 1 available) 9396 1727204051.40825: exiting _queue_task() for managed-node1/command 9396 1727204051.40836: done queuing things up, now waiting for results queue to drain 9396 1727204051.40838: waiting for pending results... 
9396 1727204051.41012: running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 9396 1727204051.41257: in run() - task 12b410aa-8751-36c5-1f9e-000000000449 9396 1727204051.41417: variable 'ansible_search_path' from source: unknown 9396 1727204051.41429: variable 'ansible_search_path' from source: unknown 9396 1727204051.41475: calling self._execute() 9396 1727204051.41684: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.41703: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.41727: variable 'omit' from source: magic vars 9396 1727204051.42228: variable 'ansible_distribution_major_version' from source: facts 9396 1727204051.42256: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204051.42444: variable 'profile_stat' from source: set_fact 9396 1727204051.42484: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204051.42571: when evaluation is False, skipping this task 9396 1727204051.42574: _execute() done 9396 1727204051.42577: dumping result to json 9396 1727204051.42579: done dumping result, returning 9396 1727204051.42582: done running TaskExecutor() for managed-node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [12b410aa-8751-36c5-1f9e-000000000449] 9396 1727204051.42589: sending task result for task 12b410aa-8751-36c5-1f9e-000000000449 9396 1727204051.42669: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000449 9396 1727204051.42672: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204051.42869: no more pending results, returning what we have 9396 1727204051.42875: results queue empty 9396 1727204051.42877: checking for any_errors_fatal 9396 1727204051.42886: done checking for any_errors_fatal 9396 1727204051.42887: checking for max_fail_percentage 
9396 1727204051.42892: done checking for max_fail_percentage 9396 1727204051.42894: checking to see if all hosts have failed and the running result is not ok 9396 1727204051.42895: done checking to see if all hosts have failed 9396 1727204051.42896: getting the remaining hosts for this loop 9396 1727204051.42897: done getting the remaining hosts for this loop 9396 1727204051.42903: getting the next task for host managed-node1 9396 1727204051.42916: done getting next task for host managed-node1 9396 1727204051.42920: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 9396 1727204051.42925: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204051.42930: getting variables 9396 1727204051.42932: in VariableManager get_vars() 9396 1727204051.42982: Calling all_inventory to load vars for managed-node1 9396 1727204051.42987: Calling groups_inventory to load vars for managed-node1 9396 1727204051.43172: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.43185: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.43192: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.43197: Calling groups_plugins_play to load vars for managed-node1 9396 1727204051.45752: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204051.48835: done with get_vars() 9396 1727204051.48880: done getting variables 9396 1727204051.49010: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204051.49159: variable 'profile' from source: include params 9396 1727204051.49163: variable 'item' from source: include params 9396 1727204051.49246: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.090) 0:00:27.463 ***** 9396 1727204051.49525: entering _queue_task() for managed-node1/set_fact 9396 1727204051.50133: worker is 1 (out of 1 available) 9396 1727204051.50143: exiting _queue_task() for managed-node1/set_fact 9396 1727204051.50156: done queuing things up, now waiting for results queue to drain 9396 1727204051.50158: waiting for pending results... 
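[Editor's note] The `set_fact` task queued here ("Verify the ansible_managed comment", get_profile_stat.yml:56) pairs with the preceding `command` task: it would normally record the check's outcome in a fact that the later assert consumes. A hedged sketch — the fact name `lsr_net_profile_ansible_managed` appears later in this log, but the expression and the consumed register are assumptions:

```yaml
# Hypothetical sketch; the rc test and register name are assumptions.
- name: Verify the ansible_managed comment in ifcfg-{{ profile }}
  set_fact:
    lsr_net_profile_ansible_managed: "{{ ansible_managed_cmd.rc == 0 }}"
  when: profile_stat.stat.exists
```

Because `profile_stat.stat.exists` is False for this run, the trace below shows the same skip path as the command task.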
9396 1727204051.50458: running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 9396 1727204051.50617: in run() - task 12b410aa-8751-36c5-1f9e-00000000044a 9396 1727204051.50645: variable 'ansible_search_path' from source: unknown 9396 1727204051.50649: variable 'ansible_search_path' from source: unknown 9396 1727204051.50687: calling self._execute() 9396 1727204051.50782: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.50834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.50838: variable 'omit' from source: magic vars 9396 1727204051.51270: variable 'ansible_distribution_major_version' from source: facts 9396 1727204051.51284: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204051.51453: variable 'profile_stat' from source: set_fact 9396 1727204051.51467: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204051.51471: when evaluation is False, skipping this task 9396 1727204051.51494: _execute() done 9396 1727204051.51499: dumping result to json 9396 1727204051.51501: done dumping result, returning 9396 1727204051.51504: done running TaskExecutor() for managed-node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [12b410aa-8751-36c5-1f9e-00000000044a] 9396 1727204051.51507: sending task result for task 12b410aa-8751-36c5-1f9e-00000000044a 9396 1727204051.51738: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000044a 9396 1727204051.51742: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204051.51788: no more pending results, returning what we have 9396 1727204051.51795: results queue empty 9396 1727204051.51797: checking for any_errors_fatal 9396 1727204051.51803: done checking for any_errors_fatal 9396 1727204051.51804: checking for 
max_fail_percentage 9396 1727204051.51806: done checking for max_fail_percentage 9396 1727204051.51809: checking to see if all hosts have failed and the running result is not ok 9396 1727204051.51811: done checking to see if all hosts have failed 9396 1727204051.51812: getting the remaining hosts for this loop 9396 1727204051.51813: done getting the remaining hosts for this loop 9396 1727204051.51817: getting the next task for host managed-node1 9396 1727204051.51824: done getting next task for host managed-node1 9396 1727204051.51827: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 9396 1727204051.51831: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204051.51835: getting variables 9396 1727204051.51837: in VariableManager get_vars() 9396 1727204051.51874: Calling all_inventory to load vars for managed-node1 9396 1727204051.51877: Calling groups_inventory to load vars for managed-node1 9396 1727204051.51880: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.51894: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.51898: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.51902: Calling groups_plugins_play to load vars for managed-node1 9396 1727204051.54085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204051.57099: done with get_vars() 9396 1727204051.57140: done getting variables 9396 1727204051.57222: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204051.57355: variable 'profile' from source: include params 9396 1727204051.57360: variable 'item' from source: include params 9396 1727204051.57509: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.082) 0:00:27.546 ***** 9396 1727204051.57546: entering _queue_task() for managed-node1/command 9396 1727204051.57870: worker is 1 (out of 1 available) 9396 1727204051.57886: exiting _queue_task() for managed-node1/command 9396 1727204051.57901: done queuing things up, now waiting for results queue to drain 9396 1727204051.57903: waiting for pending results... 
9396 1727204051.58607: running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 9396 1727204051.58612: in run() - task 12b410aa-8751-36c5-1f9e-00000000044b 9396 1727204051.58616: variable 'ansible_search_path' from source: unknown 9396 1727204051.58619: variable 'ansible_search_path' from source: unknown 9396 1727204051.58623: calling self._execute() 9396 1727204051.58627: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.58630: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.58633: variable 'omit' from source: magic vars 9396 1727204051.58999: variable 'ansible_distribution_major_version' from source: facts 9396 1727204051.59016: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204051.59170: variable 'profile_stat' from source: set_fact 9396 1727204051.59190: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204051.59194: when evaluation is False, skipping this task 9396 1727204051.59198: _execute() done 9396 1727204051.59204: dumping result to json 9396 1727204051.59208: done dumping result, returning 9396 1727204051.59219: done running TaskExecutor() for managed-node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [12b410aa-8751-36c5-1f9e-00000000044b] 9396 1727204051.59227: sending task result for task 12b410aa-8751-36c5-1f9e-00000000044b skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204051.59375: no more pending results, returning what we have 9396 1727204051.59381: results queue empty 9396 1727204051.59382: checking for any_errors_fatal 9396 1727204051.59392: done checking for any_errors_fatal 9396 1727204051.59393: checking for max_fail_percentage 9396 1727204051.59395: done checking for max_fail_percentage 9396 1727204051.59396: checking to see if all hosts have failed and the running result 
is not ok 9396 1727204051.59397: done checking to see if all hosts have failed 9396 1727204051.59398: getting the remaining hosts for this loop 9396 1727204051.59400: done getting the remaining hosts for this loop 9396 1727204051.59405: getting the next task for host managed-node1 9396 1727204051.59416: done getting next task for host managed-node1 9396 1727204051.59419: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 9396 1727204051.59424: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204051.59430: getting variables 9396 1727204051.59432: in VariableManager get_vars() 9396 1727204051.59477: Calling all_inventory to load vars for managed-node1 9396 1727204051.59480: Calling groups_inventory to load vars for managed-node1 9396 1727204051.59484: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.59700: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.59705: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.59712: Calling groups_plugins_play to load vars for managed-node1 9396 1727204051.60508: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000044b 9396 1727204051.60512: WORKER PROCESS EXITING 9396 1727204051.63612: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204051.68037: done with get_vars() 9396 1727204051.68078: done getting variables 9396 1727204051.68357: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204051.68693: variable 'profile' from source: include params 9396 1727204051.68698: variable 'item' from source: include params 9396 1727204051.68780: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.112) 0:00:27.659 ***** 9396 1727204051.68821: entering _queue_task() for managed-node1/set_fact 9396 1727204051.69600: worker is 1 (out of 1 available) 9396 1727204051.69618: exiting _queue_task() for managed-node1/set_fact 9396 1727204051.69631: done 
queuing things up, now waiting for results queue to drain 9396 1727204051.69633: waiting for pending results... 9396 1727204051.70247: running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 9396 1727204051.70377: in run() - task 12b410aa-8751-36c5-1f9e-00000000044c 9396 1727204051.70497: variable 'ansible_search_path' from source: unknown 9396 1727204051.70500: variable 'ansible_search_path' from source: unknown 9396 1727204051.70503: calling self._execute() 9396 1727204051.70544: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.70557: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.70570: variable 'omit' from source: magic vars 9396 1727204051.71110: variable 'ansible_distribution_major_version' from source: facts 9396 1727204051.71113: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204051.71518: variable 'profile_stat' from source: set_fact 9396 1727204051.71534: Evaluated conditional (profile_stat.stat.exists): False 9396 1727204051.71537: when evaluation is False, skipping this task 9396 1727204051.71542: _execute() done 9396 1727204051.71547: dumping result to json 9396 1727204051.71552: done dumping result, returning 9396 1727204051.71559: done running TaskExecutor() for managed-node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [12b410aa-8751-36c5-1f9e-00000000044c] 9396 1727204051.71566: sending task result for task 12b410aa-8751-36c5-1f9e-00000000044c 9396 1727204051.71672: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000044c 9396 1727204051.71677: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 9396 1727204051.71737: no more pending results, returning what we have 9396 1727204051.71742: results queue empty 9396 1727204051.71744: checking for any_errors_fatal 
9396 1727204051.71750: done checking for any_errors_fatal 9396 1727204051.71751: checking for max_fail_percentage 9396 1727204051.71753: done checking for max_fail_percentage 9396 1727204051.71754: checking to see if all hosts have failed and the running result is not ok 9396 1727204051.71755: done checking to see if all hosts have failed 9396 1727204051.71756: getting the remaining hosts for this loop 9396 1727204051.71757: done getting the remaining hosts for this loop 9396 1727204051.71762: getting the next task for host managed-node1 9396 1727204051.71771: done getting next task for host managed-node1 9396 1727204051.71775: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 9396 1727204051.71779: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204051.71786: getting variables 9396 1727204051.71788: in VariableManager get_vars() 9396 1727204051.71845: Calling all_inventory to load vars for managed-node1 9396 1727204051.71849: Calling groups_inventory to load vars for managed-node1 9396 1727204051.71852: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.71871: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.71876: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.71880: Calling groups_plugins_play to load vars for managed-node1 9396 1727204051.75877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204051.79254: done with get_vars() 9396 1727204051.79397: done getting variables 9396 1727204051.79468: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204051.79722: variable 'profile' from source: include params 9396 1727204051.79727: variable 'item' from source: include params 9396 1727204051.79821: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.110) 0:00:27.769 ***** 9396 1727204051.79861: entering _queue_task() for managed-node1/assert 9396 1727204051.80217: worker is 1 (out of 1 available) 9396 1727204051.80231: exiting _queue_task() for managed-node1/assert 9396 1727204051.80245: done queuing things up, now waiting for results queue to drain 9396 1727204051.80247: waiting for pending results... 
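[Editor's note] Unlike the skipped tasks above, the assert queued here ("Assert that the profile is present - 'bond0.1'", assert_profile_present.yml:5) actually runs: the log below shows `lsr_net_profile_exists` evaluating True and "All assertions passed". A minimal sketch consistent with that trace — the fact name and module are confirmed by the log; the failure message is an assumption:

```yaml
# Sketch of an assert consistent with this log; fail_msg is an assumption.
- name: Assert that the profile is present - '{{ profile }}'
  assert:
    that:
      - lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is not present"  # hypothetical
```

Note in the trace that a full connection/shell plugin resolution (ssh, sh) happens even for `assert`, because the action plugin machinery sets connection vars before discovering the action runs locally.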
9396 1727204051.80705: running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' 9396 1727204051.80720: in run() - task 12b410aa-8751-36c5-1f9e-00000000026f 9396 1727204051.80724: variable 'ansible_search_path' from source: unknown 9396 1727204051.80728: variable 'ansible_search_path' from source: unknown 9396 1727204051.80731: calling self._execute() 9396 1727204051.80838: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.80846: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.80858: variable 'omit' from source: magic vars 9396 1727204051.81286: variable 'ansible_distribution_major_version' from source: facts 9396 1727204051.81301: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204051.81311: variable 'omit' from source: magic vars 9396 1727204051.81356: variable 'omit' from source: magic vars 9396 1727204051.81483: variable 'profile' from source: include params 9396 1727204051.81492: variable 'item' from source: include params 9396 1727204051.81571: variable 'item' from source: include params 9396 1727204051.81598: variable 'omit' from source: magic vars 9396 1727204051.81636: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204051.81797: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204051.81801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204051.81804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204051.81810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204051.81814: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 
1727204051.81817: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.81820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.81919: Set connection var ansible_timeout to 10 9396 1727204051.81936: Set connection var ansible_shell_executable to /bin/sh 9396 1727204051.81959: Set connection var ansible_pipelining to False 9396 1727204051.81962: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204051.81964: Set connection var ansible_connection to ssh 9396 1727204051.81967: Set connection var ansible_shell_type to sh 9396 1727204051.81991: variable 'ansible_shell_executable' from source: unknown 9396 1727204051.81994: variable 'ansible_connection' from source: unknown 9396 1727204051.81997: variable 'ansible_module_compression' from source: unknown 9396 1727204051.82002: variable 'ansible_shell_type' from source: unknown 9396 1727204051.82095: variable 'ansible_shell_executable' from source: unknown 9396 1727204051.82101: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.82104: variable 'ansible_pipelining' from source: unknown 9396 1727204051.82111: variable 'ansible_timeout' from source: unknown 9396 1727204051.82114: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.82372: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204051.82376: variable 'omit' from source: magic vars 9396 1727204051.82379: starting attempt loop 9396 1727204051.82381: running the handler 9396 1727204051.82595: variable 'lsr_net_profile_exists' from source: set_fact 9396 1727204051.82598: Evaluated conditional (lsr_net_profile_exists): True 9396 1727204051.82602: 
handler run complete 9396 1727204051.82604: attempt loop complete, returning result 9396 1727204051.82606: _execute() done 9396 1727204051.82610: dumping result to json 9396 1727204051.82612: done dumping result, returning 9396 1727204051.82614: done running TaskExecutor() for managed-node1/TASK: Assert that the profile is present - 'bond0.1' [12b410aa-8751-36c5-1f9e-00000000026f] 9396 1727204051.82616: sending task result for task 12b410aa-8751-36c5-1f9e-00000000026f 9396 1727204051.82682: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000026f 9396 1727204051.82685: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204051.82746: no more pending results, returning what we have 9396 1727204051.82750: results queue empty 9396 1727204051.82751: checking for any_errors_fatal 9396 1727204051.82758: done checking for any_errors_fatal 9396 1727204051.82759: checking for max_fail_percentage 9396 1727204051.82761: done checking for max_fail_percentage 9396 1727204051.82762: checking to see if all hosts have failed and the running result is not ok 9396 1727204051.82763: done checking to see if all hosts have failed 9396 1727204051.82764: getting the remaining hosts for this loop 9396 1727204051.82766: done getting the remaining hosts for this loop 9396 1727204051.82770: getting the next task for host managed-node1 9396 1727204051.82777: done getting next task for host managed-node1 9396 1727204051.82780: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 9396 1727204051.82783: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204051.82788: getting variables 9396 1727204051.82791: in VariableManager get_vars() 9396 1727204051.82836: Calling all_inventory to load vars for managed-node1 9396 1727204051.82840: Calling groups_inventory to load vars for managed-node1 9396 1727204051.82843: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.82856: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.82861: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.82865: Calling groups_plugins_play to load vars for managed-node1 9396 1727204051.85078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204051.88087: done with get_vars() 9396 1727204051.88125: done getting variables 9396 1727204051.88196: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204051.88329: variable 'profile' from source: include params 9396 1727204051.88334: variable 'item' from source: include params 9396 1727204051.88405: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Tuesday 24 September 2024 14:54:11 -0400 (0:00:00.085) 0:00:27.855 ***** 9396 1727204051.88445: entering _queue_task() for managed-node1/assert 9396 1727204051.88785: worker is 1 (out of 1 available) 9396 1727204051.88999: exiting _queue_task() for 
managed-node1/assert 9396 1727204051.89010: done queuing things up, now waiting for results queue to drain 9396 1727204051.89012: waiting for pending results... 9396 1727204051.89413: running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 9396 1727204051.89419: in run() - task 12b410aa-8751-36c5-1f9e-000000000270 9396 1727204051.89423: variable 'ansible_search_path' from source: unknown 9396 1727204051.89426: variable 'ansible_search_path' from source: unknown 9396 1727204051.89430: calling self._execute() 9396 1727204051.89433: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.89436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.89438: variable 'omit' from source: magic vars 9396 1727204051.89910: variable 'ansible_distribution_major_version' from source: facts 9396 1727204051.89914: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204051.89917: variable 'omit' from source: magic vars 9396 1727204051.89947: variable 'omit' from source: magic vars 9396 1727204051.90069: variable 'profile' from source: include params 9396 1727204051.90073: variable 'item' from source: include params 9396 1727204051.90150: variable 'item' from source: include params 9396 1727204051.90174: variable 'omit' from source: magic vars 9396 1727204051.90343: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204051.90347: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204051.90350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204051.90353: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204051.90355: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204051.90452: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204051.90456: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.90459: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.90487: Set connection var ansible_timeout to 10 9396 1727204051.90497: Set connection var ansible_shell_executable to /bin/sh 9396 1727204051.90511: Set connection var ansible_pipelining to False 9396 1727204051.90517: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204051.90525: Set connection var ansible_connection to ssh 9396 1727204051.90528: Set connection var ansible_shell_type to sh 9396 1727204051.90562: variable 'ansible_shell_executable' from source: unknown 9396 1727204051.90566: variable 'ansible_connection' from source: unknown 9396 1727204051.90569: variable 'ansible_module_compression' from source: unknown 9396 1727204051.90573: variable 'ansible_shell_type' from source: unknown 9396 1727204051.90576: variable 'ansible_shell_executable' from source: unknown 9396 1727204051.90581: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.90587: variable 'ansible_pipelining' from source: unknown 9396 1727204051.90591: variable 'ansible_timeout' from source: unknown 9396 1727204051.90598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.90778: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204051.90783: variable 'omit' from source: magic vars 9396 1727204051.90785: starting attempt loop 9396 
1727204051.90788: running the handler 9396 1727204051.90996: variable 'lsr_net_profile_ansible_managed' from source: set_fact 9396 1727204051.90999: Evaluated conditional (lsr_net_profile_ansible_managed): True 9396 1727204051.91002: handler run complete 9396 1727204051.91005: attempt loop complete, returning result 9396 1727204051.91010: _execute() done 9396 1727204051.91013: dumping result to json 9396 1727204051.91016: done dumping result, returning 9396 1727204051.91018: done running TaskExecutor() for managed-node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [12b410aa-8751-36c5-1f9e-000000000270] 9396 1727204051.91020: sending task result for task 12b410aa-8751-36c5-1f9e-000000000270 9396 1727204051.91083: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000270 9396 1727204051.91086: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204051.91146: no more pending results, returning what we have 9396 1727204051.91151: results queue empty 9396 1727204051.91152: checking for any_errors_fatal 9396 1727204051.91162: done checking for any_errors_fatal 9396 1727204051.91163: checking for max_fail_percentage 9396 1727204051.91165: done checking for max_fail_percentage 9396 1727204051.91166: checking to see if all hosts have failed and the running result is not ok 9396 1727204051.91167: done checking to see if all hosts have failed 9396 1727204051.91168: getting the remaining hosts for this loop 9396 1727204051.91170: done getting the remaining hosts for this loop 9396 1727204051.91175: getting the next task for host managed-node1 9396 1727204051.91182: done getting next task for host managed-node1 9396 1727204051.91186: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 9396 1727204051.91192: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204051.91198: getting variables 9396 1727204051.91200: in VariableManager get_vars() 9396 1727204051.91245: Calling all_inventory to load vars for managed-node1 9396 1727204051.91249: Calling groups_inventory to load vars for managed-node1 9396 1727204051.91252: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.91265: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.91269: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.91273: Calling groups_plugins_play to load vars for managed-node1 9396 1727204051.93721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204051.96619: done with get_vars() 9396 1727204051.96660: done getting variables 9396 1727204051.96734: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204051.96865: variable 'profile' from source: include params 9396 1727204051.96869: variable 'item' from source: include params 9396 1727204051.96943: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Tuesday 24 
September 2024 14:54:11 -0400 (0:00:00.085) 0:00:27.940 ***** 9396 1727204051.96984: entering _queue_task() for managed-node1/assert 9396 1727204051.97337: worker is 1 (out of 1 available) 9396 1727204051.97351: exiting _queue_task() for managed-node1/assert 9396 1727204051.97365: done queuing things up, now waiting for results queue to drain 9396 1727204051.97367: waiting for pending results... 9396 1727204051.97914: running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 9396 1727204051.97921: in run() - task 12b410aa-8751-36c5-1f9e-000000000271 9396 1727204051.97926: variable 'ansible_search_path' from source: unknown 9396 1727204051.97930: variable 'ansible_search_path' from source: unknown 9396 1727204051.97934: calling self._execute() 9396 1727204051.98099: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.98103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.98109: variable 'omit' from source: magic vars 9396 1727204051.98431: variable 'ansible_distribution_major_version' from source: facts 9396 1727204051.98444: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204051.98452: variable 'omit' from source: magic vars 9396 1727204051.98598: variable 'omit' from source: magic vars 9396 1727204051.98620: variable 'profile' from source: include params 9396 1727204051.98624: variable 'item' from source: include params 9396 1727204051.98897: variable 'item' from source: include params 9396 1727204051.98901: variable 'omit' from source: magic vars 9396 1727204051.98904: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204051.98910: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204051.98913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 
1727204051.98915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204051.98918: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204051.98921: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204051.98924: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.98927: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.99049: Set connection var ansible_timeout to 10 9396 1727204051.99058: Set connection var ansible_shell_executable to /bin/sh 9396 1727204051.99069: Set connection var ansible_pipelining to False 9396 1727204051.99077: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204051.99084: Set connection var ansible_connection to ssh 9396 1727204051.99088: Set connection var ansible_shell_type to sh 9396 1727204051.99120: variable 'ansible_shell_executable' from source: unknown 9396 1727204051.99124: variable 'ansible_connection' from source: unknown 9396 1727204051.99126: variable 'ansible_module_compression' from source: unknown 9396 1727204051.99131: variable 'ansible_shell_type' from source: unknown 9396 1727204051.99134: variable 'ansible_shell_executable' from source: unknown 9396 1727204051.99140: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204051.99145: variable 'ansible_pipelining' from source: unknown 9396 1727204051.99154: variable 'ansible_timeout' from source: unknown 9396 1727204051.99160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204051.99398: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204051.99402: variable 'omit' from source: magic vars 9396 1727204051.99405: starting attempt loop 9396 1727204051.99410: running the handler 9396 1727204051.99490: variable 'lsr_net_profile_fingerprint' from source: set_fact 9396 1727204051.99509: Evaluated conditional (lsr_net_profile_fingerprint): True 9396 1727204051.99513: handler run complete 9396 1727204051.99598: attempt loop complete, returning result 9396 1727204051.99602: _execute() done 9396 1727204051.99605: dumping result to json 9396 1727204051.99611: done dumping result, returning 9396 1727204051.99614: done running TaskExecutor() for managed-node1/TASK: Assert that the fingerprint comment is present in bond0.1 [12b410aa-8751-36c5-1f9e-000000000271] 9396 1727204051.99616: sending task result for task 12b410aa-8751-36c5-1f9e-000000000271 9396 1727204051.99679: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000271 9396 1727204051.99681: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false } MSG: All assertions passed 9396 1727204051.99747: no more pending results, returning what we have 9396 1727204051.99752: results queue empty 9396 1727204051.99753: checking for any_errors_fatal 9396 1727204051.99762: done checking for any_errors_fatal 9396 1727204051.99763: checking for max_fail_percentage 9396 1727204051.99765: done checking for max_fail_percentage 9396 1727204051.99766: checking to see if all hosts have failed and the running result is not ok 9396 1727204051.99767: done checking to see if all hosts have failed 9396 1727204051.99768: getting the remaining hosts for this loop 9396 1727204051.99770: done getting the remaining hosts for this loop 9396 1727204051.99775: getting the next task for host managed-node1 9396 1727204051.99784: done getting next task for host managed-node1 9396 
1727204051.99787: ^ task is: TASK: ** TEST check polling interval 9396 1727204051.99791: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204051.99797: getting variables 9396 1727204051.99799: in VariableManager get_vars() 9396 1727204051.99846: Calling all_inventory to load vars for managed-node1 9396 1727204051.99850: Calling groups_inventory to load vars for managed-node1 9396 1727204051.99854: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204051.99867: Calling all_plugins_play to load vars for managed-node1 9396 1727204051.99871: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204051.99876: Calling groups_plugins_play to load vars for managed-node1 9396 1727204052.02325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204052.05475: done with get_vars() 9396 1727204052.05737: done getting variables 9396 1727204052.05811: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:75 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.088) 0:00:28.029 ***** 9396 1727204052.05846: entering _queue_task() for managed-node1/command 9396 1727204052.06631: worker is 1 (out of 1 available) 9396 1727204052.06646: exiting _queue_task() for managed-node1/command 
9396 1727204052.06659: done queuing things up, now waiting for results queue to drain 9396 1727204052.06661: waiting for pending results... 9396 1727204052.07477: running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval 9396 1727204052.07494: in run() - task 12b410aa-8751-36c5-1f9e-000000000071 9396 1727204052.07681: variable 'ansible_search_path' from source: unknown 9396 1727204052.07686: calling self._execute() 9396 1727204052.07871: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204052.08099: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204052.08103: variable 'omit' from source: magic vars 9396 1727204052.08927: variable 'ansible_distribution_major_version' from source: facts 9396 1727204052.09013: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204052.09033: variable 'omit' from source: magic vars 9396 1727204052.09069: variable 'omit' from source: magic vars 9396 1727204052.09217: variable 'controller_device' from source: play vars 9396 1727204052.09243: variable 'omit' from source: magic vars 9396 1727204052.09294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204052.09331: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204052.09358: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204052.09380: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204052.09410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204052.09497: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204052.09501: variable 'ansible_host' from source: host vars for 
'managed-node1' 9396 1727204052.09505: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204052.09574: Set connection var ansible_timeout to 10 9396 1727204052.09581: Set connection var ansible_shell_executable to /bin/sh 9396 1727204052.09594: Set connection var ansible_pipelining to False 9396 1727204052.09602: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204052.09611: Set connection var ansible_connection to ssh 9396 1727204052.09614: Set connection var ansible_shell_type to sh 9396 1727204052.09647: variable 'ansible_shell_executable' from source: unknown 9396 1727204052.09650: variable 'ansible_connection' from source: unknown 9396 1727204052.09653: variable 'ansible_module_compression' from source: unknown 9396 1727204052.09658: variable 'ansible_shell_type' from source: unknown 9396 1727204052.09661: variable 'ansible_shell_executable' from source: unknown 9396 1727204052.09666: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204052.09735: variable 'ansible_pipelining' from source: unknown 9396 1727204052.09747: variable 'ansible_timeout' from source: unknown 9396 1727204052.09750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204052.09851: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204052.09867: variable 'omit' from source: magic vars 9396 1727204052.09875: starting attempt loop 9396 1727204052.09878: running the handler 9396 1727204052.09901: _low_level_execute_command(): starting 9396 1727204052.09912: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204052.10729: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 
2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.10795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.10880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.10885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.12645: stdout chunk (state=3): >>>/root <<< 9396 1727204052.12750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.12995: stderr chunk (state=3): >>><<< 9396 1727204052.12999: stdout chunk (state=3): >>><<< 9396 1727204052.13002: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204052.13005: _low_level_execute_command(): starting 9396 1727204052.13008: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562 `" && echo ansible-tmp-1727204052.1284423-11476-195974357426562="` echo /root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562 `" ) && sleep 0' 9396 1727204052.13477: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204052.13488: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.13502: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204052.13521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204052.13534: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204052.13543: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204052.13554: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.13576: stderr chunk (state=3): 
>>>debug1: configuration requests final Match pass <<< 9396 1727204052.13580: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204052.13587: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204052.13604: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.13678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204052.13683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204052.13685: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204052.13687: stderr chunk (state=3): >>>debug2: match found <<< 9396 1727204052.13689: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.13741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204052.13744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.13765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.13925: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.15942: stdout chunk (state=3): >>>ansible-tmp-1727204052.1284423-11476-195974357426562=/root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562 <<< 9396 1727204052.16055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.16105: stderr chunk (state=3): >>><<< 9396 1727204052.16120: stdout chunk (state=3): >>><<< 9396 1727204052.16137: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204052.1284423-11476-195974357426562=/root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204052.16191: variable 'ansible_module_compression' from source: unknown 9396 1727204052.16229: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204052.16263: variable 'ansible_facts' from source: unknown 9396 1727204052.16328: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/AnsiballZ_command.py 9396 1727204052.16445: Sending initial data 9396 1727204052.16449: Sent initial data (155 bytes) 9396 1727204052.16848: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.16884: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204052.16887: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.16894: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.16897: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.16946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.16953: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.17000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.18761: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204052.18822: 
stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204052.18826: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/AnsiballZ_command.py" <<< 9396 1727204052.18829: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmppyfhnm0a /root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/AnsiballZ_command.py <<< 9396 1727204052.18865: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmppyfhnm0a" to remote "/root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/AnsiballZ_command.py" <<< 9396 1727204052.20340: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.20400: stderr chunk (state=3): >>><<< 9396 1727204052.20404: stdout chunk (state=3): >>><<< 9396 1727204052.20428: done transferring module to remote 9396 1727204052.20441: _low_level_execute_command(): starting 9396 1727204052.20446: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/ /root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/AnsiballZ_command.py && sleep 0' 9396 1727204052.20881: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204052.20885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 
1727204052.20888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.20896: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204052.20899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.20946: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.20952: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.20998: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.22983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.23041: stderr chunk (state=3): >>><<< 9396 1727204052.23071: stdout chunk (state=3): >>><<< 9396 1727204052.23108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204052.23115: _low_level_execute_command(): starting 9396 1727204052.23123: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/AnsiballZ_command.py && sleep 0' 9396 1727204052.23584: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.23605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.23624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.23673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204052.23677: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 9396 1727204052.23740: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.41709: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-24 14:54:12.412477", "end": "2024-09-24 14:54:12.416193", "delta": "0:00:00.003716", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204052.43450: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204052.43508: stderr chunk (state=3): >>><<< 9396 1727204052.43512: stdout chunk (state=3): >>><<< 9396 1727204052.43530: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/deprecated-bond"], "start": "2024-09-24 14:54:12.412477", "end": "2024-09-24 14:54:12.416193", "delta": "0:00:00.003716", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204052.43566: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/deprecated-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204052.43575: _low_level_execute_command(): starting 9396 1727204052.43581: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204052.1284423-11476-195974357426562/ > /dev/null 2>&1 && sleep 0' 9396 1727204052.44050: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204052.44054: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.44061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204052.44065: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.44105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204052.44112: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.44221: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.46283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.46287: stdout chunk (state=3): >>><<< 9396 1727204052.46396: stderr chunk (state=3): >>><<< 9396 1727204052.46416: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204052.46425: handler run complete 9396 1727204052.46458: Evaluated conditional (False): False 9396 1727204052.46871: variable 'result' from source: unknown 9396 1727204052.46893: Evaluated conditional ('110' in result.stdout): True 9396 1727204052.47026: attempt loop complete, returning result 9396 1727204052.47030: _execute() done 9396 1727204052.47033: dumping result to json 9396 1727204052.47040: done dumping result, returning 9396 1727204052.47050: done running TaskExecutor() for managed-node1/TASK: ** TEST check polling interval [12b410aa-8751-36c5-1f9e-000000000071] 9396 1727204052.47058: sending task result for task 12b410aa-8751-36c5-1f9e-000000000071 9396 1727204052.47353: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000071 9396 1727204052.47356: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/deprecated-bond" ], "delta": "0:00:00.003716", "end": "2024-09-24 14:54:12.416193", "rc": 0, "start": "2024-09-24 14:54:12.412477" } STDOUT: MII Polling Interval (ms): 110 9396 1727204052.47460: no more 
pending results, returning what we have 9396 1727204052.47465: results queue empty 9396 1727204052.47466: checking for any_errors_fatal 9396 1727204052.47474: done checking for any_errors_fatal 9396 1727204052.47475: checking for max_fail_percentage 9396 1727204052.47477: done checking for max_fail_percentage 9396 1727204052.47478: checking to see if all hosts have failed and the running result is not ok 9396 1727204052.47479: done checking to see if all hosts have failed 9396 1727204052.47480: getting the remaining hosts for this loop 9396 1727204052.47482: done getting the remaining hosts for this loop 9396 1727204052.47486: getting the next task for host managed-node1 9396 1727204052.47695: done getting next task for host managed-node1 9396 1727204052.47699: ^ task is: TASK: ** TEST check IPv4 9396 1727204052.47701: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204052.47705: getting variables 9396 1727204052.47709: in VariableManager get_vars() 9396 1727204052.47749: Calling all_inventory to load vars for managed-node1 9396 1727204052.47752: Calling groups_inventory to load vars for managed-node1 9396 1727204052.47755: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204052.47765: Calling all_plugins_play to load vars for managed-node1 9396 1727204052.47769: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204052.47772: Calling groups_plugins_play to load vars for managed-node1 9396 1727204052.50975: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204052.53898: done with get_vars() 9396 1727204052.53949: done getting variables 9396 1727204052.54031: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:80 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.482) 0:00:28.511 ***** 9396 1727204052.54067: entering _queue_task() for managed-node1/command 9396 1727204052.54463: worker is 1 (out of 1 available) 9396 1727204052.54477: exiting _queue_task() for managed-node1/command 9396 1727204052.54694: done queuing things up, now waiting for results queue to drain 9396 1727204052.54697: waiting for pending results... 
9396 1727204052.54901: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 9396 1727204052.54946: in run() - task 12b410aa-8751-36c5-1f9e-000000000072 9396 1727204052.54970: variable 'ansible_search_path' from source: unknown 9396 1727204052.55088: calling self._execute() 9396 1727204052.55151: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204052.55165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204052.55184: variable 'omit' from source: magic vars 9396 1727204052.55648: variable 'ansible_distribution_major_version' from source: facts 9396 1727204052.55672: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204052.55690: variable 'omit' from source: magic vars 9396 1727204052.55721: variable 'omit' from source: magic vars 9396 1727204052.55838: variable 'controller_device' from source: play vars 9396 1727204052.55896: variable 'omit' from source: magic vars 9396 1727204052.55926: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204052.55969: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204052.55999: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204052.56034: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204052.56118: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204052.56121: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204052.56124: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204052.56126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204052.56241: Set connection var 
ansible_timeout to 10 9396 1727204052.56254: Set connection var ansible_shell_executable to /bin/sh 9396 1727204052.56269: Set connection var ansible_pipelining to False 9396 1727204052.56280: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204052.56293: Set connection var ansible_connection to ssh 9396 1727204052.56300: Set connection var ansible_shell_type to sh 9396 1727204052.56342: variable 'ansible_shell_executable' from source: unknown 9396 1727204052.56351: variable 'ansible_connection' from source: unknown 9396 1727204052.56360: variable 'ansible_module_compression' from source: unknown 9396 1727204052.56367: variable 'ansible_shell_type' from source: unknown 9396 1727204052.56374: variable 'ansible_shell_executable' from source: unknown 9396 1727204052.56381: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204052.56391: variable 'ansible_pipelining' from source: unknown 9396 1727204052.56445: variable 'ansible_timeout' from source: unknown 9396 1727204052.56448: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204052.56585: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204052.56610: variable 'omit' from source: magic vars 9396 1727204052.56622: starting attempt loop 9396 1727204052.56630: running the handler 9396 1727204052.56651: _low_level_execute_command(): starting 9396 1727204052.56669: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204052.57516: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.57605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204052.57641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.57732: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.59489: stdout chunk (state=3): >>>/root <<< 9396 1727204052.59685: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.59691: stdout chunk (state=3): >>><<< 9396 1727204052.59694: stderr chunk (state=3): >>><<< 9396 1727204052.59719: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204052.59740: _low_level_execute_command(): starting 9396 1727204052.59837: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363 `" && echo ansible-tmp-1727204052.5972588-11503-24939527759363="` echo /root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363 `" ) && sleep 0' 9396 1727204052.60411: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204052.60429: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.60522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.60582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.60616: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.60699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.62728: stdout chunk (state=3): >>>ansible-tmp-1727204052.5972588-11503-24939527759363=/root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363 <<< 9396 1727204052.62918: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.62934: stderr chunk (state=3): >>><<< 9396 1727204052.62944: stdout chunk (state=3): >>><<< 9396 1727204052.62972: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204052.5972588-11503-24939527759363=/root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK 
debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204052.63024: variable 'ansible_module_compression' from source: unknown 9396 1727204052.63088: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204052.63146: variable 'ansible_facts' from source: unknown 9396 1727204052.63253: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/AnsiballZ_command.py 9396 1727204052.63528: Sending initial data 9396 1727204052.63531: Sent initial data (154 bytes) 9396 1727204052.64004: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204052.64015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.64027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204052.64043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204052.64104: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.64151: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204052.64171: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.64184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.64263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.65937: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204052.65980: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204052.66029: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpyyoecinz /root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/AnsiballZ_command.py <<< 9396 1727204052.66032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/AnsiballZ_command.py" <<< 9396 1727204052.66100: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpyyoecinz" to remote "/root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/AnsiballZ_command.py" <<< 9396 1727204052.67176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.67284: stderr chunk (state=3): >>><<< 9396 1727204052.67394: stdout chunk (state=3): >>><<< 9396 1727204052.67400: done transferring module to remote 9396 1727204052.67403: _low_level_execute_command(): starting 9396 1727204052.67406: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/ /root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/AnsiballZ_command.py && sleep 0' 9396 1727204052.68026: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204052.68049: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.68069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204052.68168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.68204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204052.68223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.68250: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.68332: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.70377: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.70390: stdout chunk (state=3): >>><<< 9396 1727204052.70409: stderr chunk (state=3): >>><<< 9396 1727204052.70432: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204052.70530: _low_level_execute_command(): starting 9396 1727204052.70535: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/AnsiballZ_command.py && sleep 0' 9396 1727204052.71105: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204052.71120: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204052.71207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.71260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204052.71281: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.71300: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 9396 1727204052.71393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.90823: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.217/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 234sec preferred_lft 234sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:12.903703", "end": "2024-09-24 14:54:12.907349", "delta": "0:00:00.003646", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204052.92499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204052.92557: stderr chunk (state=3): >>><<< 9396 1727204052.92561: stdout chunk (state=3): >>><<< 9396 1727204052.92579: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.217/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond\n valid_lft 234sec preferred_lft 234sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:12.903703", "end": "2024-09-24 14:54:12.907349", "delta": "0:00:00.003646", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204052.92617: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204052.92626: _low_level_execute_command(): starting 9396 1727204052.92632: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204052.5972588-11503-24939527759363/ > /dev/null 2>&1 && sleep 0' 9396 1727204052.93080: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204052.93128: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204052.93131: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204052.93134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204052.93137: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204052.93144: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204052.93194: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204052.93198: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204052.93199: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204052.93242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204052.95165: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204052.95219: stderr chunk (state=3): >>><<< 9396 1727204052.95222: stdout chunk (state=3): >>><<< 9396 1727204052.95239: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204052.95246: handler run complete 9396 1727204052.95268: Evaluated conditional (False): False 9396 1727204052.95404: variable 'result' from source: set_fact 9396 1727204052.95423: Evaluated conditional ('192.0.2' in result.stdout): True 9396 1727204052.95435: attempt loop complete, returning result 9396 1727204052.95438: _execute() done 9396 1727204052.95441: dumping result to json 9396 1727204052.95448: done dumping result, returning 9396 1727204052.95456: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv4 [12b410aa-8751-36c5-1f9e-000000000072] 9396 1727204052.95462: sending task result for task 12b410aa-8751-36c5-1f9e-000000000072 9396 1727204052.95569: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000072 9396 1727204052.95572: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003646", "end": "2024-09-24 14:54:12.907349", "rc": 0, "start": "2024-09-24 14:54:12.903703" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.217/24 brd 192.0.2.255 scope global dynamic noprefixroute deprecated-bond valid_lft 234sec preferred_lft 234sec 9396 1727204052.95675: no more pending results, returning what we have 9396 1727204052.95679: results queue empty 9396 1727204052.95680: checking for any_errors_fatal 9396 1727204052.95691: done checking for any_errors_fatal 9396 1727204052.95692: checking for max_fail_percentage 9396 1727204052.95696: done checking for max_fail_percentage 9396 1727204052.95697: checking to see if all hosts have failed and the running result is not ok 9396 1727204052.95699: done checking to see if all hosts have failed 9396 1727204052.95699: getting the remaining hosts for 
this loop 9396 1727204052.95701: done getting the remaining hosts for this loop 9396 1727204052.95705: getting the next task for host managed-node1 9396 1727204052.95712: done getting next task for host managed-node1 9396 1727204052.95715: ^ task is: TASK: ** TEST check IPv6 9396 1727204052.95717: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204052.95721: getting variables 9396 1727204052.95722: in VariableManager get_vars() 9396 1727204052.95761: Calling all_inventory to load vars for managed-node1 9396 1727204052.95764: Calling groups_inventory to load vars for managed-node1 9396 1727204052.95767: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204052.95778: Calling all_plugins_play to load vars for managed-node1 9396 1727204052.95781: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204052.95785: Calling groups_plugins_play to load vars for managed-node1 9396 1727204052.97777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204052.99797: done with get_vars() 9396 1727204052.99831: done getting variables 9396 1727204052.99887: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:87 Tuesday 24 September 2024 14:54:12 -0400 (0:00:00.458) 
0:00:28.970 ***** 9396 1727204052.99916: entering _queue_task() for managed-node1/command 9396 1727204053.00200: worker is 1 (out of 1 available) 9396 1727204053.00216: exiting _queue_task() for managed-node1/command 9396 1727204053.00230: done queuing things up, now waiting for results queue to drain 9396 1727204053.00233: waiting for pending results... 9396 1727204053.00434: running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 9396 1727204053.00511: in run() - task 12b410aa-8751-36c5-1f9e-000000000073 9396 1727204053.00529: variable 'ansible_search_path' from source: unknown 9396 1727204053.00576: calling self._execute() 9396 1727204053.00802: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.00806: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.00809: variable 'omit' from source: magic vars 9396 1727204053.01218: variable 'ansible_distribution_major_version' from source: facts 9396 1727204053.01250: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204053.01262: variable 'omit' from source: magic vars 9396 1727204053.01294: variable 'omit' from source: magic vars 9396 1727204053.01426: variable 'controller_device' from source: play vars 9396 1727204053.01465: variable 'omit' from source: magic vars 9396 1727204053.01523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204053.01584: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204053.01617: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204053.01644: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204053.01664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 9396 1727204053.01720: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204053.01730: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.01740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.01868: Set connection var ansible_timeout to 10 9396 1727204053.01874: Set connection var ansible_shell_executable to /bin/sh 9396 1727204053.01883: Set connection var ansible_pipelining to False 9396 1727204053.01895: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204053.01906: Set connection var ansible_connection to ssh 9396 1727204053.01912: Set connection var ansible_shell_type to sh 9396 1727204053.01934: variable 'ansible_shell_executable' from source: unknown 9396 1727204053.01937: variable 'ansible_connection' from source: unknown 9396 1727204053.01941: variable 'ansible_module_compression' from source: unknown 9396 1727204053.01946: variable 'ansible_shell_type' from source: unknown 9396 1727204053.01949: variable 'ansible_shell_executable' from source: unknown 9396 1727204053.01953: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.01958: variable 'ansible_pipelining' from source: unknown 9396 1727204053.01965: variable 'ansible_timeout' from source: unknown 9396 1727204053.01974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.02102: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204053.02122: variable 'omit' from source: magic vars 9396 1727204053.02125: starting attempt loop 9396 1727204053.02128: running the handler 9396 1727204053.02146: 
_low_level_execute_command(): starting 9396 1727204053.02153: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204053.02680: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.02712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.02716: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.02718: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.02783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204053.02788: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.02831: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.04637: stdout chunk (state=3): >>>/root <<< 9396 1727204053.04749: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.04810: stderr chunk (state=3): >>><<< 9396 1727204053.04813: stdout chunk (state=3): >>><<< 9396 1727204053.04843: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204053.04854: _low_level_execute_command(): starting 9396 1727204053.04861: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042 `" && echo ansible-tmp-1727204053.0483994-11520-48059399116042="` echo /root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042 `" ) && sleep 0' 9396 1727204053.05348: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.05352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204053.05354: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.05357: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.05368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.05418: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204053.05421: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.05473: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.07515: stdout chunk (state=3): >>>ansible-tmp-1727204053.0483994-11520-48059399116042=/root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042 <<< 9396 1727204053.07639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.07681: stderr chunk (state=3): >>><<< 9396 1727204053.07684: stdout chunk (state=3): >>><<< 9396 1727204053.07702: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204053.0483994-11520-48059399116042=/root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204053.07737: variable 'ansible_module_compression' from source: unknown 9396 1727204053.07782: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204053.07818: variable 'ansible_facts' from source: unknown 9396 1727204053.07878: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/AnsiballZ_command.py 9396 1727204053.08000: Sending initial data 9396 1727204053.08004: Sent initial data (154 bytes) 9396 1727204053.08460: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.08464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.08467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.08469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.08529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204053.08532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.08574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.10247: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204053.10288: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204053.10324: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpfy7bjyui /root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/AnsiballZ_command.py <<< 9396 1727204053.10332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/AnsiballZ_command.py" <<< 9396 1727204053.10366: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpfy7bjyui" to remote "/root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/AnsiballZ_command.py" <<< 9396 1727204053.11159: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.11231: stderr chunk (state=3): >>><<< 9396 1727204053.11234: stdout chunk (state=3): >>><<< 9396 1727204053.11254: done transferring module to remote 9396 1727204053.11264: _low_level_execute_command(): starting 9396 1727204053.11271: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/ /root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/AnsiballZ_command.py && sleep 0' 9396 1727204053.11751: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.11754: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204053.11757: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.11760: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.11762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.11811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204053.11815: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.11879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.13782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.13832: stderr chunk (state=3): >>><<< 9396 1727204053.13836: stdout chunk (state=3): >>><<< 9396 1727204053.13851: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204053.13854: _low_level_execute_command(): starting 9396 1727204053.13860: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/AnsiballZ_command.py && sleep 0' 9396 1727204053.14287: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.14331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204053.14334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204053.14337: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.14341: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.14343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204053.14346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.14395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 9396 1727204053.14402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.14448: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.32773: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::10a/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::3549:a0b1:8a62:b090/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::937a:2635:e04f:bd89/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:13.323156", "end": "2024-09-24 14:54:13.326817", "delta": "0:00:00.003661", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204053.34464: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204053.34526: stderr chunk (state=3): >>><<< 9396 1727204053.34530: stdout chunk (state=3): >>><<< 9396 1727204053.34546: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::10a/128 scope global dynamic noprefixroute \n valid_lft 235sec preferred_lft 235sec\n inet6 2001:db8::3549:a0b1:8a62:b090/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::937a:2635:e04f:bd89/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "deprecated-bond"], "start": "2024-09-24 14:54:13.323156", "end": "2024-09-24 14:54:13.326817", "delta": "0:00:00.003661", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204053.34592: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204053.34602: _low_level_execute_command(): starting 9396 1727204053.34610: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204053.0483994-11520-48059399116042/ > /dev/null 2>&1 && sleep 0' 9396 1727204053.35051: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.35060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204053.35088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.35093: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.35096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.35158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204053.35164: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.35213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.37137: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.37181: stderr chunk (state=3): >>><<< 9396 1727204053.37185: stdout chunk (state=3): >>><<< 9396 1727204053.37202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204053.37212: handler run complete 9396 1727204053.37236: Evaluated conditional (False): False 9396 1727204053.37372: variable 'result' from source: set_fact 9396 1727204053.37388: Evaluated conditional ('2001' in result.stdout): True 9396 1727204053.37404: attempt loop complete, returning result 9396 1727204053.37410: _execute() done 9396 1727204053.37413: dumping result to json 9396 1727204053.37418: done dumping result, returning 9396 1727204053.37426: done running TaskExecutor() for managed-node1/TASK: ** TEST check IPv6 [12b410aa-8751-36c5-1f9e-000000000073] 9396 1727204053.37432: sending task result for task 12b410aa-8751-36c5-1f9e-000000000073 9396 1727204053.37544: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000073 9396 1727204053.37547: WORKER PROCESS EXITING ok: [managed-node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "deprecated-bond" ], "delta": "0:00:00.003661", "end": "2024-09-24 14:54:13.326817", "rc": 0, "start": "2024-09-24 14:54:13.323156" } STDOUT: 13: deprecated-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::10a/128 scope global dynamic noprefixroute valid_lft 235sec preferred_lft 235sec inet6 2001:db8::3549:a0b1:8a62:b090/64 scope global dynamic noprefixroute valid_lft 1796sec preferred_lft 1796sec inet6 fe80::937a:2635:e04f:bd89/64 scope link noprefixroute valid_lft forever preferred_lft forever 9396 1727204053.37649: no more pending results, returning what we have 9396 1727204053.37653: results queue empty 9396 1727204053.37654: checking for any_errors_fatal 9396 1727204053.37660: done checking for any_errors_fatal 9396 
1727204053.37661: checking for max_fail_percentage 9396 1727204053.37663: done checking for max_fail_percentage 9396 1727204053.37664: checking to see if all hosts have failed and the running result is not ok 9396 1727204053.37665: done checking to see if all hosts have failed 9396 1727204053.37666: getting the remaining hosts for this loop 9396 1727204053.37667: done getting the remaining hosts for this loop 9396 1727204053.37672: getting the next task for host managed-node1 9396 1727204053.37683: done getting next task for host managed-node1 9396 1727204053.37696: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 9396 1727204053.37700: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204053.37721: getting variables 9396 1727204053.37722: in VariableManager get_vars() 9396 1727204053.37764: Calling all_inventory to load vars for managed-node1 9396 1727204053.37767: Calling groups_inventory to load vars for managed-node1 9396 1727204053.37769: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204053.37780: Calling all_plugins_play to load vars for managed-node1 9396 1727204053.37783: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204053.37787: Calling groups_plugins_play to load vars for managed-node1 9396 1727204053.43755: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204053.45372: done with get_vars() 9396 1727204053.45414: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.455) 0:00:29.426 ***** 9396 1727204053.45520: entering _queue_task() for managed-node1/include_tasks 9396 1727204053.45899: worker is 1 (out of 1 available) 9396 1727204053.45915: exiting _queue_task() for managed-node1/include_tasks 9396 1727204053.45929: done queuing things up, now waiting for results queue to drain 9396 1727204053.45931: waiting for pending results... 
9396 1727204053.46351: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 9396 1727204053.46449: in run() - task 12b410aa-8751-36c5-1f9e-00000000007d 9396 1727204053.46461: variable 'ansible_search_path' from source: unknown 9396 1727204053.46465: variable 'ansible_search_path' from source: unknown 9396 1727204053.46502: calling self._execute() 9396 1727204053.46588: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.46597: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.46606: variable 'omit' from source: magic vars 9396 1727204053.46928: variable 'ansible_distribution_major_version' from source: facts 9396 1727204053.46940: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204053.46948: _execute() done 9396 1727204053.46951: dumping result to json 9396 1727204053.46959: done dumping result, returning 9396 1727204053.46965: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-36c5-1f9e-00000000007d] 9396 1727204053.46971: sending task result for task 12b410aa-8751-36c5-1f9e-00000000007d 9396 1727204053.47069: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000007d 9396 1727204053.47072: WORKER PROCESS EXITING 9396 1727204053.47124: no more pending results, returning what we have 9396 1727204053.47129: in VariableManager get_vars() 9396 1727204053.47179: Calling all_inventory to load vars for managed-node1 9396 1727204053.47190: Calling groups_inventory to load vars for managed-node1 9396 1727204053.47194: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204053.47205: Calling all_plugins_play to load vars for managed-node1 9396 1727204053.47209: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204053.47213: Calling groups_plugins_play to load vars for 
managed-node1 9396 1727204053.48401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204053.50082: done with get_vars() 9396 1727204053.50118: variable 'ansible_search_path' from source: unknown 9396 1727204053.50120: variable 'ansible_search_path' from source: unknown 9396 1727204053.50166: we have included files to process 9396 1727204053.50168: generating all_blocks data 9396 1727204053.50172: done generating all_blocks data 9396 1727204053.50178: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 9396 1727204053.50179: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 9396 1727204053.50182: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 9396 1727204053.50967: done processing included file 9396 1727204053.50970: iterating over new_blocks loaded from include file 9396 1727204053.50972: in VariableManager get_vars() 9396 1727204053.51010: done with get_vars() 9396 1727204053.51013: filtering new block on tags 9396 1727204053.51055: done filtering new block on tags 9396 1727204053.51059: in VariableManager get_vars() 9396 1727204053.51094: done with get_vars() 9396 1727204053.51096: filtering new block on tags 9396 1727204053.51159: done filtering new block on tags 9396 1727204053.51163: in VariableManager get_vars() 9396 1727204053.51198: done with get_vars() 9396 1727204053.51200: filtering new block on tags 9396 1727204053.51258: done filtering new block on tags 9396 1727204053.51262: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node1 9396 1727204053.51268: extending task lists for all hosts with included blocks 9396 1727204053.52785: done 
extending task lists 9396 1727204053.52787: done processing included files 9396 1727204053.52788: results queue empty 9396 1727204053.52791: checking for any_errors_fatal 9396 1727204053.52797: done checking for any_errors_fatal 9396 1727204053.52798: checking for max_fail_percentage 9396 1727204053.52800: done checking for max_fail_percentage 9396 1727204053.52801: checking to see if all hosts have failed and the running result is not ok 9396 1727204053.52802: done checking to see if all hosts have failed 9396 1727204053.52803: getting the remaining hosts for this loop 9396 1727204053.52805: done getting the remaining hosts for this loop 9396 1727204053.52811: getting the next task for host managed-node1 9396 1727204053.52817: done getting next task for host managed-node1 9396 1727204053.52821: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 9396 1727204053.52825: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 9396 1727204053.52838: getting variables 9396 1727204053.52840: in VariableManager get_vars() 9396 1727204053.52862: Calling all_inventory to load vars for managed-node1 9396 1727204053.52865: Calling groups_inventory to load vars for managed-node1 9396 1727204053.52868: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204053.52875: Calling all_plugins_play to load vars for managed-node1 9396 1727204053.52878: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204053.52883: Calling groups_plugins_play to load vars for managed-node1 9396 1727204053.54804: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204053.56382: done with get_vars() 9396 1727204053.56411: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.109) 0:00:29.535 ***** 9396 1727204053.56484: entering _queue_task() for managed-node1/setup 9396 1727204053.56771: worker is 1 (out of 1 available) 9396 1727204053.56786: exiting _queue_task() for managed-node1/setup 9396 1727204053.56802: done queuing things up, now waiting for results queue to drain 9396 1727204053.56804: waiting for pending results... 
9396 1727204053.56993: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 9396 1727204053.57142: in run() - task 12b410aa-8751-36c5-1f9e-000000000494 9396 1727204053.57157: variable 'ansible_search_path' from source: unknown 9396 1727204053.57160: variable 'ansible_search_path' from source: unknown 9396 1727204053.57193: calling self._execute() 9396 1727204053.57279: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.57286: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.57299: variable 'omit' from source: magic vars 9396 1727204053.57625: variable 'ansible_distribution_major_version' from source: facts 9396 1727204053.57637: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204053.57831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204053.59548: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204053.59610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204053.59643: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204053.59676: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204053.59703: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204053.59773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204053.59801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204053.59825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204053.59857: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204053.59871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204053.59922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204053.59942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204053.59963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204053.59999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204053.60015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204053.60140: variable '__network_required_facts' from source: role '' defaults 
9396 1727204053.60147: variable 'ansible_facts' from source: unknown 9396 1727204053.60902: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 9396 1727204053.60906: when evaluation is False, skipping this task 9396 1727204053.60909: _execute() done 9396 1727204053.60916: dumping result to json 9396 1727204053.60920: done dumping result, returning 9396 1727204053.60928: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-36c5-1f9e-000000000494] 9396 1727204053.60934: sending task result for task 12b410aa-8751-36c5-1f9e-000000000494 9396 1727204053.61030: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000494 9396 1727204053.61032: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204053.61078: no more pending results, returning what we have 9396 1727204053.61082: results queue empty 9396 1727204053.61084: checking for any_errors_fatal 9396 1727204053.61085: done checking for any_errors_fatal 9396 1727204053.61086: checking for max_fail_percentage 9396 1727204053.61087: done checking for max_fail_percentage 9396 1727204053.61088: checking to see if all hosts have failed and the running result is not ok 9396 1727204053.61092: done checking to see if all hosts have failed 9396 1727204053.61093: getting the remaining hosts for this loop 9396 1727204053.61094: done getting the remaining hosts for this loop 9396 1727204053.61099: getting the next task for host managed-node1 9396 1727204053.61110: done getting next task for host managed-node1 9396 1727204053.61114: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 9396 1727204053.61120: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204053.61140: getting variables 9396 1727204053.61142: in VariableManager get_vars() 9396 1727204053.61184: Calling all_inventory to load vars for managed-node1 9396 1727204053.61188: Calling groups_inventory to load vars for managed-node1 9396 1727204053.61199: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204053.61210: Calling all_plugins_play to load vars for managed-node1 9396 1727204053.61214: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204053.61217: Calling groups_plugins_play to load vars for managed-node1 9396 1727204053.62513: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204053.64081: done with get_vars() 9396 1727204053.64104: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.077) 0:00:29.612 ***** 9396 1727204053.64193: entering _queue_task() for managed-node1/stat 9396 1727204053.64435: worker is 1 (out of 1 available) 9396 1727204053.64452: exiting _queue_task() for managed-node1/stat 9396 1727204053.64463: done queuing things up, now waiting for results queue to drain 9396 1727204053.64464: waiting for pending results... 
9396 1727204053.64656: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 9396 1727204053.64783: in run() - task 12b410aa-8751-36c5-1f9e-000000000496 9396 1727204053.64797: variable 'ansible_search_path' from source: unknown 9396 1727204053.64802: variable 'ansible_search_path' from source: unknown 9396 1727204053.64839: calling self._execute() 9396 1727204053.64925: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.64933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.64944: variable 'omit' from source: magic vars 9396 1727204053.65259: variable 'ansible_distribution_major_version' from source: facts 9396 1727204053.65270: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204053.65411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204053.65635: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204053.65672: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204053.65706: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204053.65738: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204053.65845: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204053.65866: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204053.65889: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204053.65919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204053.65991: variable '__network_is_ostree' from source: set_fact 9396 1727204053.65998: Evaluated conditional (not __network_is_ostree is defined): False 9396 1727204053.66003: when evaluation is False, skipping this task 9396 1727204053.66006: _execute() done 9396 1727204053.66014: dumping result to json 9396 1727204053.66017: done dumping result, returning 9396 1727204053.66026: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-36c5-1f9e-000000000496] 9396 1727204053.66031: sending task result for task 12b410aa-8751-36c5-1f9e-000000000496 9396 1727204053.66126: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000496 9396 1727204053.66129: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 9396 1727204053.66187: no more pending results, returning what we have 9396 1727204053.66193: results queue empty 9396 1727204053.66195: checking for any_errors_fatal 9396 1727204053.66202: done checking for any_errors_fatal 9396 1727204053.66203: checking for max_fail_percentage 9396 1727204053.66205: done checking for max_fail_percentage 9396 1727204053.66206: checking to see if all hosts have failed and the running result is not ok 9396 1727204053.66207: done checking to see if all hosts have failed 9396 1727204053.66208: getting the remaining hosts for this loop 9396 1727204053.66209: done getting the remaining hosts for this loop 9396 1727204053.66214: getting the next task for host managed-node1 9396 1727204053.66221: done 
getting next task for host managed-node1 9396 1727204053.66230: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 9396 1727204053.66235: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204053.66253: getting variables 9396 1727204053.66255: in VariableManager get_vars() 9396 1727204053.66297: Calling all_inventory to load vars for managed-node1 9396 1727204053.66301: Calling groups_inventory to load vars for managed-node1 9396 1727204053.66304: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204053.66313: Calling all_plugins_play to load vars for managed-node1 9396 1727204053.66317: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204053.66320: Calling groups_plugins_play to load vars for managed-node1 9396 1727204053.67496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204053.69084: done with get_vars() 9396 1727204053.69109: done getting variables 9396 1727204053.69156: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.049) 0:00:29.662 ***** 9396 1727204053.69186: entering _queue_task() for managed-node1/set_fact 9396 1727204053.69422: worker is 1 (out of 1 available) 9396 1727204053.69438: exiting _queue_task() for managed-node1/set_fact 9396 1727204053.69449: done queuing things up, now waiting for results queue to drain 9396 1727204053.69451: waiting for pending results... 
9396 1727204053.69641: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 9396 1727204053.69770: in run() - task 12b410aa-8751-36c5-1f9e-000000000497 9396 1727204053.69783: variable 'ansible_search_path' from source: unknown 9396 1727204053.69787: variable 'ansible_search_path' from source: unknown 9396 1727204053.69842: calling self._execute() 9396 1727204053.69950: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.69954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.69958: variable 'omit' from source: magic vars 9396 1727204053.70398: variable 'ansible_distribution_major_version' from source: facts 9396 1727204053.70402: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204053.70605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204053.71008: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204053.71012: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204053.71015: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204053.71075: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204053.71434: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204053.71457: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204053.71483: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204053.71506: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204053.71582: variable '__network_is_ostree' from source: set_fact 9396 1727204053.71591: Evaluated conditional (not __network_is_ostree is defined): False 9396 1727204053.71595: when evaluation is False, skipping this task 9396 1727204053.71598: _execute() done 9396 1727204053.71603: dumping result to json 9396 1727204053.71606: done dumping result, returning 9396 1727204053.71618: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-36c5-1f9e-000000000497] 9396 1727204053.71623: sending task result for task 12b410aa-8751-36c5-1f9e-000000000497 9396 1727204053.71718: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000497 9396 1727204053.71721: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 9396 1727204053.71774: no more pending results, returning what we have 9396 1727204053.71779: results queue empty 9396 1727204053.71780: checking for any_errors_fatal 9396 1727204053.71785: done checking for any_errors_fatal 9396 1727204053.71786: checking for max_fail_percentage 9396 1727204053.71788: done checking for max_fail_percentage 9396 1727204053.71792: checking to see if all hosts have failed and the running result is not ok 9396 1727204053.71793: done checking to see if all hosts have failed 9396 1727204053.71794: getting the remaining hosts for this loop 9396 1727204053.71795: done getting the remaining hosts for this loop 9396 1727204053.71800: 
getting the next task for host managed-node1 9396 1727204053.71810: done getting next task for host managed-node1 9396 1727204053.71815: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 9396 1727204053.71820: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204053.71839: getting variables 9396 1727204053.71841: in VariableManager get_vars() 9396 1727204053.71879: Calling all_inventory to load vars for managed-node1 9396 1727204053.71882: Calling groups_inventory to load vars for managed-node1 9396 1727204053.71885: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204053.71902: Calling all_plugins_play to load vars for managed-node1 9396 1727204053.71906: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204053.71910: Calling groups_plugins_play to load vars for managed-node1 9396 1727204053.73184: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204053.75626: done with get_vars() 9396 1727204053.75662: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:54:13 -0400 (0:00:00.065) 0:00:29.728 ***** 9396 1727204053.75779: entering _queue_task() for managed-node1/service_facts 9396 1727204053.76116: worker is 1 (out of 1 available) 9396 1727204053.76130: exiting _queue_task() for managed-node1/service_facts 9396 1727204053.76145: done queuing things up, now waiting for results queue to drain 9396 1727204053.76146: waiting for pending results... 
9396 1727204053.76520: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running 9396 1727204053.76724: in run() - task 12b410aa-8751-36c5-1f9e-000000000499 9396 1727204053.76729: variable 'ansible_search_path' from source: unknown 9396 1727204053.76732: variable 'ansible_search_path' from source: unknown 9396 1727204053.76734: calling self._execute() 9396 1727204053.76846: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.76864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.76886: variable 'omit' from source: magic vars 9396 1727204053.77335: variable 'ansible_distribution_major_version' from source: facts 9396 1727204053.77355: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204053.77367: variable 'omit' from source: magic vars 9396 1727204053.77485: variable 'omit' from source: magic vars 9396 1727204053.77694: variable 'omit' from source: magic vars 9396 1727204053.77698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204053.77701: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204053.77704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204053.77707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204053.77709: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204053.77740: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204053.77749: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.77758: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 9396 1727204053.77886: Set connection var ansible_timeout to 10 9396 1727204053.77902: Set connection var ansible_shell_executable to /bin/sh 9396 1727204053.77917: Set connection var ansible_pipelining to False 9396 1727204053.77929: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204053.77946: Set connection var ansible_connection to ssh 9396 1727204053.77953: Set connection var ansible_shell_type to sh 9396 1727204053.77985: variable 'ansible_shell_executable' from source: unknown 9396 1727204053.77996: variable 'ansible_connection' from source: unknown 9396 1727204053.78005: variable 'ansible_module_compression' from source: unknown 9396 1727204053.78013: variable 'ansible_shell_type' from source: unknown 9396 1727204053.78020: variable 'ansible_shell_executable' from source: unknown 9396 1727204053.78027: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204053.78035: variable 'ansible_pipelining' from source: unknown 9396 1727204053.78046: variable 'ansible_timeout' from source: unknown 9396 1727204053.78056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204053.78286: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204053.78308: variable 'omit' from source: magic vars 9396 1727204053.78319: starting attempt loop 9396 1727204053.78372: running the handler 9396 1727204053.78376: _low_level_execute_command(): starting 9396 1727204053.78378: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204053.79113: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204053.79136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.79208: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.79273: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204053.79294: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204053.79321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.79403: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.81165: stdout chunk (state=3): >>>/root <<< 9396 1727204053.81401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.81405: stdout chunk (state=3): >>><<< 9396 1727204053.81410: stderr chunk (state=3): >>><<< 9396 1727204053.81565: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204053.81569: _low_level_execute_command(): starting 9396 1727204053.81573: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018 `" && echo ansible-tmp-1727204053.8144903-11550-120082438952018="` echo /root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018 `" ) && sleep 0' 9396 1727204053.82533: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.82557: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204053.82579: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.82660: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.82697: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.84741: stdout chunk (state=3): >>>ansible-tmp-1727204053.8144903-11550-120082438952018=/root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018 <<< 9396 1727204053.84996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.85000: stdout chunk (state=3): >>><<< 9396 1727204053.85002: stderr chunk (state=3): >>><<< 9396 1727204053.85005: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204053.8144903-11550-120082438952018=/root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204053.85035: variable 'ansible_module_compression' from source: unknown 9396 1727204053.85088: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 9396 1727204053.85143: variable 'ansible_facts' from source: unknown 9396 1727204053.85251: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/AnsiballZ_service_facts.py 9396 1727204053.85478: Sending initial data 9396 1727204053.85481: Sent initial data (161 bytes) 9396 1727204053.86098: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204053.86116: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.86207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.86238: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
<<< 9396 1727204053.86253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204053.86276: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.86428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.88040: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204053.88081: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204053.88136: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp9y_bdsc_ /root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/AnsiballZ_service_facts.py <<< 9396 1727204053.88141: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/AnsiballZ_service_facts.py" <<< 9396 1727204053.88178: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp9y_bdsc_" to remote "/root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/AnsiballZ_service_facts.py" <<< 9396 1727204053.89428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.89441: stderr chunk (state=3): >>><<< 9396 1727204053.89455: stdout chunk (state=3): >>><<< 9396 1727204053.89484: done transferring module to remote 9396 1727204053.89505: _low_level_execute_command(): starting 9396 1727204053.89601: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/ /root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/AnsiballZ_service_facts.py && sleep 0' 9396 1727204053.90185: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204053.90203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.90220: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.90305: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204053.90365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204053.90386: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204053.90412: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204053.90482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204053.92553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204053.92557: stdout chunk (state=3): >>><<< 9396 1727204053.92595: stderr chunk (state=3): >>><<< 9396 1727204053.92599: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204053.92602: _low_level_execute_command(): starting 9396 1727204053.92605: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/AnsiballZ_service_facts.py && sleep 0' 9396 1727204053.93364: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204053.93496: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204053.93500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204053.93503: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204053.93696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204053.93721: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 9396 1727204053.93812: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204055.96486: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": 
"stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name"<<< 9396 1727204055.96605: stdout chunk (state=3): >>>: "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": 
{"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": 
"alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": 
{"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": 
"systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": 
"systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 9396 1727204055.98520: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204055.98654: stderr chunk (state=3): >>><<< 9396 1727204055.98664: stdout chunk (state=3): >>><<< 9396 1727204055.98847: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": 
"dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", 
"status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", 
"status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, 
"systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", 
"source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": 
"dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": 
"fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": 
{"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": 
"systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": 
"systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204056.01170: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204056.01395: _low_level_execute_command(): starting 9396 1727204056.01399: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204053.8144903-11550-120082438952018/ > /dev/null 2>&1 && sleep 0' 9396 1727204056.02566: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204056.02582: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204056.02633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204056.02858: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204056.02872: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204056.02958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204056.05126: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204056.05199: stderr chunk (state=3): >>><<< 9396 1727204056.05223: stdout chunk (state=3): >>><<< 9396 1727204056.05597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204056.05601: handler run complete 9396 1727204056.05932: variable 'ansible_facts' from source: unknown 9396 1727204056.06385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204056.08587: variable 'ansible_facts' from source: unknown 9396 1727204056.09025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204056.10184: attempt loop complete, returning result 9396 1727204056.10562: _execute() done 9396 1727204056.10566: dumping result to json 9396 1727204056.10569: done dumping result, returning 9396 1727204056.10630: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-36c5-1f9e-000000000499] 9396 1727204056.10678: sending task result for task 12b410aa-8751-36c5-1f9e-000000000499 9396 1727204056.13257: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000499 9396 1727204056.13261: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204056.13397: no more pending results, returning what we have 9396 1727204056.13401: results queue empty 
9396 1727204056.13403: checking for any_errors_fatal 9396 1727204056.13408: done checking for any_errors_fatal 9396 1727204056.13409: checking for max_fail_percentage 9396 1727204056.13411: done checking for max_fail_percentage 9396 1727204056.13412: checking to see if all hosts have failed and the running result is not ok 9396 1727204056.13413: done checking to see if all hosts have failed 9396 1727204056.13414: getting the remaining hosts for this loop 9396 1727204056.13416: done getting the remaining hosts for this loop 9396 1727204056.13420: getting the next task for host managed-node1 9396 1727204056.13428: done getting next task for host managed-node1 9396 1727204056.13432: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 9396 1727204056.13438: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204056.13451: getting variables 9396 1727204056.13453: in VariableManager get_vars() 9396 1727204056.13678: Calling all_inventory to load vars for managed-node1 9396 1727204056.13682: Calling groups_inventory to load vars for managed-node1 9396 1727204056.13686: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204056.13700: Calling all_plugins_play to load vars for managed-node1 9396 1727204056.13703: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204056.13708: Calling groups_plugins_play to load vars for managed-node1 9396 1727204056.17927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204056.23939: done with get_vars() 9396 1727204056.23987: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:54:16 -0400 (0:00:02.485) 0:00:32.214 ***** 9396 1727204056.24319: entering _queue_task() for managed-node1/package_facts 9396 1727204056.25123: worker is 1 (out of 1 available) 9396 1727204056.25140: exiting _queue_task() for managed-node1/package_facts 9396 1727204056.25154: done queuing things up, now waiting for results queue to drain 9396 1727204056.25156: waiting for pending results... 
9396 1727204056.25810: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 9396 1727204056.26378: in run() - task 12b410aa-8751-36c5-1f9e-00000000049a 9396 1727204056.26383: variable 'ansible_search_path' from source: unknown 9396 1727204056.26385: variable 'ansible_search_path' from source: unknown 9396 1727204056.26387: calling self._execute() 9396 1727204056.26797: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204056.26802: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204056.26805: variable 'omit' from source: magic vars 9396 1727204056.27498: variable 'ansible_distribution_major_version' from source: facts 9396 1727204056.27577: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204056.27592: variable 'omit' from source: magic vars 9396 1727204056.27892: variable 'omit' from source: magic vars 9396 1727204056.27947: variable 'omit' from source: magic vars 9396 1727204056.28119: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204056.28160: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204056.28185: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204056.28208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204056.28313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204056.28360: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204056.28369: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204056.28442: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 9396 1727204056.28690: Set connection var ansible_timeout to 10 9396 1727204056.28706: Set connection var ansible_shell_executable to /bin/sh 9396 1727204056.28726: Set connection var ansible_pipelining to False 9396 1727204056.28773: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204056.28787: Set connection var ansible_connection to ssh 9396 1727204056.28798: Set connection var ansible_shell_type to sh 9396 1727204056.28905: variable 'ansible_shell_executable' from source: unknown 9396 1727204056.28918: variable 'ansible_connection' from source: unknown 9396 1727204056.28927: variable 'ansible_module_compression' from source: unknown 9396 1727204056.28935: variable 'ansible_shell_type' from source: unknown 9396 1727204056.28942: variable 'ansible_shell_executable' from source: unknown 9396 1727204056.28950: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204056.28959: variable 'ansible_pipelining' from source: unknown 9396 1727204056.28986: variable 'ansible_timeout' from source: unknown 9396 1727204056.29000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204056.29600: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204056.29605: variable 'omit' from source: magic vars 9396 1727204056.29611: starting attempt loop 9396 1727204056.29613: running the handler 9396 1727204056.29616: _low_level_execute_command(): starting 9396 1727204056.29619: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204056.31101: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204056.31161: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204056.31392: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204056.33155: stdout chunk (state=3): >>>/root <<< 9396 1727204056.33262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204056.33446: stderr chunk (state=3): >>><<< 9396 1727204056.33451: stdout chunk (state=3): >>><<< 9396 1727204056.33459: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204056.33499: _low_level_execute_command(): starting 9396 1727204056.33774: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057 `" && echo ansible-tmp-1727204056.3343034-11622-118476181921057="` echo /root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057 `" ) && sleep 0' 9396 1727204056.34739: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204056.34760: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204056.34775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204056.34828: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204056.35055: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204056.35088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204056.37136: stdout chunk (state=3): >>>ansible-tmp-1727204056.3343034-11622-118476181921057=/root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057 <<< 9396 1727204056.37349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204056.38095: stderr chunk (state=3): >>><<< 9396 1727204056.38099: stdout chunk (state=3): >>><<< 9396 1727204056.38102: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204056.3343034-11622-118476181921057=/root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204056.38105: variable 'ansible_module_compression' from source: unknown 9396 1727204056.38108: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 9396 1727204056.38496: variable 'ansible_facts' from source: unknown 9396 1727204056.39130: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/AnsiballZ_package_facts.py 9396 1727204056.39512: Sending initial data 9396 1727204056.39515: Sent initial data (161 bytes) 9396 1727204056.40703: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204056.41058: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204056.41308: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204056.41375: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 9396 1727204056.43085: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204056.43123: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204056.43312: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/AnsiballZ_package_facts.py" <<< 9396 1727204056.43315: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp9_rtp92b /root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/AnsiballZ_package_facts.py <<< 9396 1727204056.43349: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp9_rtp92b" to remote "/root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/AnsiballZ_package_facts.py" <<< 9396 1727204056.47780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204056.47826: stderr chunk (state=3): >>><<< 9396 1727204056.47841: stdout chunk (state=3): >>><<< 9396 1727204056.47877: done 
transferring module to remote 9396 1727204056.48098: _low_level_execute_command(): starting 9396 1727204056.48103: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/ /root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/AnsiballZ_package_facts.py && sleep 0' 9396 1727204056.49355: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204056.49573: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204056.49593: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204056.49616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204056.49641: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204056.49703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204056.51858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204056.51862: stdout chunk (state=3): >>><<< 9396 1727204056.51864: stderr chunk (state=3): >>><<< 9396 
1727204056.51883: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204056.51985: _low_level_execute_command(): starting 9396 1727204056.51988: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/AnsiballZ_package_facts.py && sleep 0' 9396 1727204056.52949: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204056.53306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 9396 1727204056.53331: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204056.53423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204057.18146: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": 
"20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 9396 1727204057.18320: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": 
"2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": 
[{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", 
"version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", 
"version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": 
"3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": 
"libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": 
"libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 9396 1727204057.18564: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", 
"version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": 
[{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": 
"0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": 
"perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": 
"perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": 
"13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": 
"1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": 
[{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": 
"python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": 
"wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 9396 1727204057.20443: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204057.20446: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 closed. <<< 9396 1727204057.20449: stdout chunk (state=3): >>><<< 9396 1727204057.20451: stderr chunk (state=3): >>><<< 9396 1727204057.20458: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": 
[{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", 
"version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": 
"libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": 
[{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": 
"xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": 
[{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": 
"6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": 
"systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": 
"shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": 
"3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": 
"perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": 
"noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", 
"version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", 
"version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", 
"version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", 
"version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": 
[{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": 
{"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
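(Editor's note, not part of the captured run: the large JSON above is the `ansible_facts.packages` dictionary returned by the `package_facts` module, invoked with the arguments shown in the invocation block — `manager: ["auto"]`, `strategy: first`. As a minimal sketch of how such facts are typically gathered and queried — host patterns and task names below are illustrative assumptions — a playbook fragment might look like:)

```yaml
# Sketch only: gather package facts with the same module arguments
# seen in this log, then reference one package from the output above.
- hosts: all
  tasks:
    - name: Gather installed-package facts
      ansible.builtin.package_facts:
        manager: auto
        strategy: first

    - name: Show the firewalld version reported in ansible_facts.packages
      ansible.builtin.debug:
        msg: "firewalld {{ ansible_facts.packages['firewalld'][0].version }}"
      when: "'firewalld' in ansible_facts.packages"
```

(Note that each key in `ansible_facts.packages` maps to a *list* of entries, one per installed version/arch pair — which is why the JSON above wraps every package in `[{...}]` and why the `[0]` index is needed even for singly-installed packages.)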
9396 1727204057.25871: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204057.25876: _low_level_execute_command(): starting 9396 1727204057.25879: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204056.3343034-11622-118476181921057/ > /dev/null 2>&1 && sleep 0' 9396 1727204057.26586: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204057.26611: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204057.26639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204057.26753: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204057.26774: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204057.26802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204057.26823: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204057.26914: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204057.28941: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204057.28953: stdout chunk (state=3): >>><<< 9396 1727204057.28985: stderr chunk (state=3): >>><<< 9396 1727204057.29015: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204057.29029: handler run complete 9396 
1727204057.30781: variable 'ansible_facts' from source: unknown 9396 1727204057.31930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204057.35062: variable 'ansible_facts' from source: unknown 9396 1727204057.35488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204057.36577: attempt loop complete, returning result 9396 1727204057.36629: _execute() done 9396 1727204057.36633: dumping result to json 9396 1727204057.36976: done dumping result, returning 9396 1727204057.36979: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-36c5-1f9e-00000000049a] 9396 1727204057.36981: sending task result for task 12b410aa-8751-36c5-1f9e-00000000049a 9396 1727204057.39146: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000049a 9396 1727204057.39149: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204057.39274: no more pending results, returning what we have 9396 1727204057.39277: results queue empty 9396 1727204057.39278: checking for any_errors_fatal 9396 1727204057.39283: done checking for any_errors_fatal 9396 1727204057.39284: checking for max_fail_percentage 9396 1727204057.39286: done checking for max_fail_percentage 9396 1727204057.39287: checking to see if all hosts have failed and the running result is not ok 9396 1727204057.39288: done checking to see if all hosts have failed 9396 1727204057.39291: getting the remaining hosts for this loop 9396 1727204057.39292: done getting the remaining hosts for this loop 9396 1727204057.39297: getting the next task for host managed-node1 9396 1727204057.39305: done getting next task for host managed-node1 9396 1727204057.39309: ^ task is: TASK: 
fedora.linux_system_roles.network : Print network provider 9396 1727204057.39313: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204057.39327: getting variables 9396 1727204057.39329: in VariableManager get_vars() 9396 1727204057.39367: Calling all_inventory to load vars for managed-node1 9396 1727204057.39370: Calling groups_inventory to load vars for managed-node1 9396 1727204057.39374: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204057.39384: Calling all_plugins_play to load vars for managed-node1 9396 1727204057.39388: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204057.39396: Calling groups_plugins_play to load vars for managed-node1 9396 1727204057.41359: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204057.44474: done with get_vars() 9396 1727204057.44541: done getting variables 9396 1727204057.44618: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:54:17 -0400 (0:00:01.203) 0:00:33.417 ***** 9396 1727204057.44667: entering _queue_task() for managed-node1/debug 9396 1727204057.45072: worker is 1 (out of 1 available) 9396 1727204057.45087: exiting _queue_task() for managed-node1/debug 9396 1727204057.45302: done queuing things up, now waiting for results queue to drain 9396 1727204057.45305: waiting for pending results... 9396 1727204057.45435: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider 9396 1727204057.45639: in run() - task 12b410aa-8751-36c5-1f9e-00000000007e 9396 1727204057.45644: variable 'ansible_search_path' from source: unknown 9396 1727204057.45647: variable 'ansible_search_path' from source: unknown 9396 1727204057.45685: calling self._execute() 9396 1727204057.45807: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204057.45855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204057.45859: variable 'omit' from source: magic vars 9396 1727204057.46342: variable 'ansible_distribution_major_version' from source: facts 9396 1727204057.46384: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204057.46406: variable 'omit' from source: magic vars 9396 1727204057.46616: variable 'omit' from source: magic vars 9396 1727204057.46647: variable 'network_provider' from source: set_fact 9396 1727204057.46676: variable 'omit' from source: magic vars 9396 1727204057.46732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204057.46780: 
Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204057.46813: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204057.46846: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204057.46866: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204057.46908: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204057.46918: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204057.46928: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204057.47062: Set connection var ansible_timeout to 10 9396 1727204057.47076: Set connection var ansible_shell_executable to /bin/sh 9396 1727204057.47094: Set connection var ansible_pipelining to False 9396 1727204057.47107: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204057.47119: Set connection var ansible_connection to ssh 9396 1727204057.47128: Set connection var ansible_shell_type to sh 9396 1727204057.47170: variable 'ansible_shell_executable' from source: unknown 9396 1727204057.47180: variable 'ansible_connection' from source: unknown 9396 1727204057.47267: variable 'ansible_module_compression' from source: unknown 9396 1727204057.47270: variable 'ansible_shell_type' from source: unknown 9396 1727204057.47273: variable 'ansible_shell_executable' from source: unknown 9396 1727204057.47275: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204057.47277: variable 'ansible_pipelining' from source: unknown 9396 1727204057.47279: variable 'ansible_timeout' from source: unknown 9396 1727204057.47282: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node1' 9396 1727204057.47405: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204057.47424: variable 'omit' from source: magic vars 9396 1727204057.47435: starting attempt loop 9396 1727204057.47442: running the handler 9396 1727204057.47503: handler run complete 9396 1727204057.47529: attempt loop complete, returning result 9396 1727204057.47536: _execute() done 9396 1727204057.47545: dumping result to json 9396 1727204057.47553: done dumping result, returning 9396 1727204057.47566: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-36c5-1f9e-00000000007e] 9396 1727204057.47576: sending task result for task 12b410aa-8751-36c5-1f9e-00000000007e ok: [managed-node1] => {} MSG: Using network provider: nm 9396 1727204057.47767: no more pending results, returning what we have 9396 1727204057.47772: results queue empty 9396 1727204057.47773: checking for any_errors_fatal 9396 1727204057.47785: done checking for any_errors_fatal 9396 1727204057.47786: checking for max_fail_percentage 9396 1727204057.47788: done checking for max_fail_percentage 9396 1727204057.47792: checking to see if all hosts have failed and the running result is not ok 9396 1727204057.47794: done checking to see if all hosts have failed 9396 1727204057.47795: getting the remaining hosts for this loop 9396 1727204057.47796: done getting the remaining hosts for this loop 9396 1727204057.47802: getting the next task for host managed-node1 9396 1727204057.47811: done getting next task for host managed-node1 9396 1727204057.47816: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable 
with the initscripts provider 9396 1727204057.47822: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204057.47837: getting variables 9396 1727204057.47839: in VariableManager get_vars() 9396 1727204057.48094: Calling all_inventory to load vars for managed-node1 9396 1727204057.48099: Calling groups_inventory to load vars for managed-node1 9396 1727204057.48103: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204057.48116: Calling all_plugins_play to load vars for managed-node1 9396 1727204057.48121: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204057.48125: Calling groups_plugins_play to load vars for managed-node1 9396 1727204057.49036: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000007e 9396 1727204057.49040: WORKER PROCESS EXITING 9396 1727204057.50616: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204057.53532: done with get_vars() 9396 1727204057.53571: done getting variables 9396 1727204057.53644: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.090) 0:00:33.507 ***** 9396 1727204057.53688: entering _queue_task() for managed-node1/fail 9396 1727204057.54056: worker is 1 (out of 1 available) 9396 1727204057.54070: exiting _queue_task() for managed-node1/fail 9396 1727204057.54084: done queuing things up, now waiting for results queue to drain 9396 1727204057.54086: waiting for pending results... 9396 1727204057.54403: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 9396 1727204057.54587: in run() - task 12b410aa-8751-36c5-1f9e-00000000007f 9396 1727204057.54613: variable 'ansible_search_path' from source: unknown 9396 1727204057.54626: variable 'ansible_search_path' from source: unknown 9396 1727204057.54674: calling self._execute() 9396 1727204057.54792: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204057.54807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204057.54825: variable 'omit' from source: magic vars 9396 1727204057.55264: variable 'ansible_distribution_major_version' from source: facts 9396 1727204057.55289: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204057.55447: variable 'network_state' from source: role '' defaults 9396 1727204057.55464: Evaluated conditional (network_state != {}): False 9396 1727204057.55473: when evaluation is False, skipping 
this task 9396 1727204057.55481: _execute() done 9396 1727204057.55491: dumping result to json 9396 1727204057.55502: done dumping result, returning 9396 1727204057.55517: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-36c5-1f9e-00000000007f] 9396 1727204057.55530: sending task result for task 12b410aa-8751-36c5-1f9e-00000000007f 9396 1727204057.55693: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000007f 9396 1727204057.55696: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 9396 1727204057.55775: no more pending results, returning what we have 9396 1727204057.55781: results queue empty 9396 1727204057.55782: checking for any_errors_fatal 9396 1727204057.55791: done checking for any_errors_fatal 9396 1727204057.55792: checking for max_fail_percentage 9396 1727204057.55794: done checking for max_fail_percentage 9396 1727204057.55795: checking to see if all hosts have failed and the running result is not ok 9396 1727204057.55796: done checking to see if all hosts have failed 9396 1727204057.55798: getting the remaining hosts for this loop 9396 1727204057.55799: done getting the remaining hosts for this loop 9396 1727204057.55804: getting the next task for host managed-node1 9396 1727204057.55812: done getting next task for host managed-node1 9396 1727204057.55816: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 9396 1727204057.55822: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204057.55845: getting variables 9396 1727204057.55847: in VariableManager get_vars() 9396 1727204057.55996: Calling all_inventory to load vars for managed-node1 9396 1727204057.56000: Calling groups_inventory to load vars for managed-node1 9396 1727204057.56004: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204057.56017: Calling all_plugins_play to load vars for managed-node1 9396 1727204057.56021: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204057.56025: Calling groups_plugins_play to load vars for managed-node1 9396 1727204057.58277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204057.61164: done with get_vars() 9396 1727204057.61205: done getting variables 9396 1727204057.61268: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.076) 0:00:33.584 ***** 9396 1727204057.61310: entering _queue_task() for managed-node1/fail 9396 1727204057.62093: worker is 1 (out of 1 available) 9396 1727204057.62103: exiting _queue_task() for managed-node1/fail 9396 1727204057.62114: done queuing things up, now waiting for results queue to drain 9396 1727204057.62116: waiting for pending results... 9396 1727204057.62575: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 9396 1727204057.62648: in run() - task 12b410aa-8751-36c5-1f9e-000000000080 9396 1727204057.62687: variable 'ansible_search_path' from source: unknown 9396 1727204057.62726: variable 'ansible_search_path' from source: unknown 9396 1727204057.62829: calling self._execute() 9396 1727204057.62992: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204057.63081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204057.63123: variable 'omit' from source: magic vars 9396 1727204057.64090: variable 'ansible_distribution_major_version' from source: facts 9396 1727204057.64094: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204057.64423: variable 'network_state' from source: role '' defaults 9396 1727204057.64469: Evaluated conditional (network_state != {}): False 9396 1727204057.64689: when evaluation is False, skipping this task 9396 1727204057.64694: _execute() done 9396 1727204057.64697: dumping result to json 9396 1727204057.64699: done dumping result, returning 9396 1727204057.64702: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 
[12b410aa-8751-36c5-1f9e-000000000080] 9396 1727204057.64704: sending task result for task 12b410aa-8751-36c5-1f9e-000000000080 9396 1727204057.64785: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000080 9396 1727204057.64788: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 9396 1727204057.65048: no more pending results, returning what we have 9396 1727204057.65053: results queue empty 9396 1727204057.65054: checking for any_errors_fatal 9396 1727204057.65066: done checking for any_errors_fatal 9396 1727204057.65067: checking for max_fail_percentage 9396 1727204057.65069: done checking for max_fail_percentage 9396 1727204057.65071: checking to see if all hosts have failed and the running result is not ok 9396 1727204057.65072: done checking to see if all hosts have failed 9396 1727204057.65073: getting the remaining hosts for this loop 9396 1727204057.65075: done getting the remaining hosts for this loop 9396 1727204057.65080: getting the next task for host managed-node1 9396 1727204057.65091: done getting next task for host managed-node1 9396 1727204057.65095: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 9396 1727204057.65101: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204057.65125: getting variables 9396 1727204057.65128: in VariableManager get_vars() 9396 1727204057.65174: Calling all_inventory to load vars for managed-node1 9396 1727204057.65177: Calling groups_inventory to load vars for managed-node1 9396 1727204057.65181: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204057.65397: Calling all_plugins_play to load vars for managed-node1 9396 1727204057.65402: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204057.65407: Calling groups_plugins_play to load vars for managed-node1 9396 1727204057.68647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204057.71540: done with get_vars() 9396 1727204057.71580: done getting variables 9396 1727204057.71653: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.103) 0:00:33.687 ***** 9396 1727204057.71697: entering _queue_task() for managed-node1/fail 9396 1727204057.72054: worker is 1 (out of 1 available) 9396 1727204057.72068: exiting _queue_task() for managed-node1/fail 9396 1727204057.72083: done queuing things up, now waiting for results queue to drain 9396 
1727204057.72084: waiting for pending results... 9396 1727204057.72404: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 9396 1727204057.72896: in run() - task 12b410aa-8751-36c5-1f9e-000000000081 9396 1727204057.72902: variable 'ansible_search_path' from source: unknown 9396 1727204057.72905: variable 'ansible_search_path' from source: unknown 9396 1727204057.72908: calling self._execute() 9396 1727204057.72977: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204057.73036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204057.73152: variable 'omit' from source: magic vars 9396 1727204057.74088: variable 'ansible_distribution_major_version' from source: facts 9396 1727204057.74113: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204057.74471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204057.80455: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204057.80665: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204057.80720: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204057.80768: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204057.80809: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204057.80905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204057.80949: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204057.80983: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204057.81194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204057.81198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204057.81364: variable 'ansible_distribution_major_version' from source: facts 9396 1727204057.81563: Evaluated conditional (ansible_distribution_major_version | int > 9): True 9396 1727204057.81890: variable 'ansible_distribution' from source: facts 9396 1727204057.81895: variable '__network_rh_distros' from source: role '' defaults 9396 1727204057.81898: Evaluated conditional (ansible_distribution in __network_rh_distros): False 9396 1727204057.81901: when evaluation is False, skipping this task 9396 1727204057.81903: _execute() done 9396 1727204057.81905: dumping result to json 9396 1727204057.81907: done dumping result, returning 9396 1727204057.81910: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-36c5-1f9e-000000000081] 9396 1727204057.81913: sending task result for task 12b410aa-8751-36c5-1f9e-000000000081 9396 1727204057.82297: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000081 9396 1727204057.82300: WORKER PROCESS EXITING 
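The pair of `Evaluated conditional` records above (first `ansible_distribution_major_version | int > 9` → True, then `ansible_distribution in __network_rh_distros` → False) is how Ansible logs a `when:` list being evaluated item by item, short-circuiting into a skip. A hypothetical sketch of such a guarded fail task (the role's real task is at `roles/network/tasks/main.yml:25`; this is only an illustration of the pattern):

```yaml
# Hypothetical sketch of a fail task with stacked conditionals; each
# list item under "when:" is ANDed and logged as its own
# "Evaluated conditional" record.
- name: Abort applying teaming configuration on EL10 or later
  ansible.builtin.fail:
    msg: Teaming configuration is not supported on this system version
  when:
    - ansible_distribution_major_version | int > 9
    - ansible_distribution in __network_rh_distros  # False here, so the task is skipped
```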
skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 9396 1727204057.82544: no more pending results, returning what we have 9396 1727204057.82548: results queue empty 9396 1727204057.82550: checking for any_errors_fatal 9396 1727204057.82557: done checking for any_errors_fatal 9396 1727204057.82558: checking for max_fail_percentage 9396 1727204057.82560: done checking for max_fail_percentage 9396 1727204057.82562: checking to see if all hosts have failed and the running result is not ok 9396 1727204057.82563: done checking to see if all hosts have failed 9396 1727204057.82564: getting the remaining hosts for this loop 9396 1727204057.82566: done getting the remaining hosts for this loop 9396 1727204057.82571: getting the next task for host managed-node1 9396 1727204057.82578: done getting next task for host managed-node1 9396 1727204057.82583: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 9396 1727204057.82588: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? 
False, did start at task? False 9396 1727204057.82611: getting variables 9396 1727204057.82613: in VariableManager get_vars() 9396 1727204057.82659: Calling all_inventory to load vars for managed-node1 9396 1727204057.82663: Calling groups_inventory to load vars for managed-node1 9396 1727204057.82667: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204057.82680: Calling all_plugins_play to load vars for managed-node1 9396 1727204057.82684: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204057.82712: Calling groups_plugins_play to load vars for managed-node1 9396 1727204057.86221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204057.89130: done with get_vars() 9396 1727204057.89170: done getting variables 9396 1727204057.89245: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:54:17 -0400 (0:00:00.175) 0:00:33.863 ***** 9396 1727204057.89286: entering _queue_task() for managed-node1/dnf 9396 1727204057.89682: worker is 1 (out of 1 available) 9396 1727204057.89801: exiting _queue_task() for managed-node1/dnf 9396 1727204057.89814: done queuing things up, now waiting for results queue to drain 9396 1727204057.89816: waiting for pending results... 
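The `entering _queue_task() for managed-node1/dnf` record above means the next task runs the `dnf` action plugin to probe for available package updates. A hypothetical sketch of how such a non-destructive update check can be expressed (package name and register variable are illustrative assumptions, not the role's actual source):

```yaml
# Hypothetical sketch: check for available updates without installing
# anything, by forcing check mode on a "state: latest" dnf task.
- name: Check if updates for network packages are available through DNF
  ansible.builtin.dnf:
    name:
      - NetworkManager        # illustrative package name
    state: latest
  check_mode: true            # report would-be changes only
  register: __network_updates # illustrative variable name
```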
9396 1727204057.90037: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
9396 1727204057.90299: in run() - task 12b410aa-8751-36c5-1f9e-000000000082
9396 1727204057.90304: variable 'ansible_search_path' from source: unknown
9396 1727204057.90310: variable 'ansible_search_path' from source: unknown
9396 1727204057.90314: calling self._execute()
9396 1727204057.90364: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204057.90381: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204057.90403: variable 'omit' from source: magic vars
9396 1727204057.90848: variable 'ansible_distribution_major_version' from source: facts
9396 1727204057.90866: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204057.91132: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
9396 1727204057.93811: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
9396 1727204057.93899: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
9396 1727204057.93955: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
9396 1727204057.94005: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
9396 1727204057.94045: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
9396 1727204057.94146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204057.94255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204057.94258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204057.94287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204057.94312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204057.94467: variable 'ansible_distribution' from source: facts
9396 1727204057.94482: variable 'ansible_distribution_major_version' from source: facts
9396 1727204057.94496: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
9396 1727204057.94647: variable '__network_wireless_connections_defined' from source: role '' defaults
9396 1727204057.94836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204057.94871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204057.94907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204057.95024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204057.95028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204057.95132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204057.95136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204057.95139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204057.95168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204057.95191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204057.95252: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204057.95286: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204057.95325: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204057.95383: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204057.95406: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204057.95630: variable 'network_connections' from source: task vars
9396 1727204057.95648: variable 'port2_profile' from source: play vars
9396 1727204057.95730: variable 'port2_profile' from source: play vars
9396 1727204057.95747: variable 'port1_profile' from source: play vars
9396 1727204057.95829: variable 'port1_profile' from source: play vars
9396 1727204057.95844: variable 'controller_profile' from source: play vars
9396 1727204057.95927: variable 'controller_profile' from source: play vars
9396 1727204057.96196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9396 1727204057.96251: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9396 1727204057.96302: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9396 1727204057.96349: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9396 1727204057.96390: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9396 1727204057.96451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
9396 1727204057.96492: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
9396 1727204057.96532: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204057.96569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
9396 1727204057.96638: variable '__network_team_connections_defined' from source: role '' defaults
9396 1727204057.96971: variable 'network_connections' from source: task vars
9396 1727204057.96983: variable 'port2_profile' from source: play vars
9396 1727204057.97061: variable 'port2_profile' from source: play vars
9396 1727204057.97080: variable 'port1_profile' from source: play vars
9396 1727204057.97150: variable 'port1_profile' from source: play vars
9396 1727204057.97163: variable 'controller_profile' from source: play vars
9396 1727204057.97244: variable 'controller_profile' from source: play vars
9396 1727204057.97278: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
9396 1727204057.97290: when evaluation is False, skipping this task
9396 1727204057.97396: _execute() done
9396 1727204057.97399: dumping result to json
9396 1727204057.97401: done dumping result, returning
9396 1727204057.97404: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-36c5-1f9e-000000000082]
9396 1727204057.97406: sending task result for task 12b410aa-8751-36c5-1f9e-000000000082
9396 1727204057.97483: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000082
9396 1727204057.97486: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
9396 1727204057.97557: no more pending results, returning what we have
9396 1727204057.97562: results queue empty
9396 1727204057.97563: checking for any_errors_fatal
9396 1727204057.97571: done checking for any_errors_fatal
9396 1727204057.97572: checking for max_fail_percentage
9396 1727204057.97574: done checking for max_fail_percentage
9396 1727204057.97575: checking to see if all hosts have failed and the running result is not ok
9396 1727204057.97577: done checking to see if all hosts have failed
9396 1727204057.97578: getting the remaining hosts for this loop
9396 1727204057.97579: done getting the remaining hosts for this loop
9396 1727204057.97585: getting the next task for host managed-node1
9396 1727204057.97598: done getting next task for host managed-node1
9396 1727204057.97603: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
9396 1727204057.97608: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
9396 1727204057.97631: getting variables
9396 1727204057.97633: in VariableManager get_vars()
9396 1727204057.97679: Calling all_inventory to load vars for managed-node1
9396 1727204057.97683: Calling groups_inventory to load vars for managed-node1
9396 1727204057.97686: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204057.98003: Calling all_plugins_play to load vars for managed-node1
9396 1727204057.98011: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204057.98015: Calling groups_plugins_play to load vars for managed-node1
9396 1727204058.03087: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204058.09386: done with get_vars()
9396 1727204058.09553: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
9396 1727204058.09762: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.205) 0:00:34.069 *****
9396 1727204058.09845: entering _queue_task() for managed-node1/yum
9396 1727204058.10741: worker is 1 (out of 1 available)
9396 1727204058.10755: exiting _queue_task() for managed-node1/yum
9396 1727204058.10768: done queuing things up, now waiting for results queue to drain
9396 1727204058.10769: waiting for pending results...
9396 1727204058.11409: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
9396 1727204058.11455: in run() - task 12b410aa-8751-36c5-1f9e-000000000083
9396 1727204058.11477: variable 'ansible_search_path' from source: unknown
9396 1727204058.11795: variable 'ansible_search_path' from source: unknown
9396 1727204058.11799: calling self._execute()
9396 1727204058.11851: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204058.11868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204058.11886: variable 'omit' from source: magic vars
9396 1727204058.12740: variable 'ansible_distribution_major_version' from source: facts
9396 1727204058.12913: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204058.13304: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
9396 1727204058.16270: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
9396 1727204058.16370: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
9396 1727204058.16426: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
9396 1727204058.16481: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
9396 1727204058.16526: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
9396 1727204058.16645: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204058.16693: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204058.16726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204058.16787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204058.16807: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204058.17078: variable 'ansible_distribution_major_version' from source: facts
9396 1727204058.17081: Evaluated conditional (ansible_distribution_major_version | int < 8): False
9396 1727204058.17083: when evaluation is False, skipping this task
9396 1727204058.17086: _execute() done
9396 1727204058.17087: dumping result to json
9396 1727204058.17091: done dumping result, returning
9396 1727204058.17094: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-36c5-1f9e-000000000083]
9396 1727204058.17096: sending task result for task 12b410aa-8751-36c5-1f9e-000000000083
9396 1727204058.17167: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000083
9396 1727204058.17171: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
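(Editor's note: the two conditionals the log shows being evaluated for this YUM task can be replayed in plain Python. The fact value below is an assumption for illustration — it is not read from this run — but the logic mirrors the logged `!= '6'` → True and `| int < 8` → False results:)

```python
# Replay of the two logged conditionals. The major-version value is an
# assumed example (e.g. a Fedora 40 node), not a fact taken from this run.
ansible_distribution_major_version = "40"

# "ansible_distribution_major_version != '6'" gates _execute() -> True here
gate_el6 = ansible_distribution_major_version != "6"

# "ansible_distribution_major_version | int < 8" selects YUM-era hosts -> False,
# so the task is skipped, exactly as the log records.
use_yum = int(ansible_distribution_major_version) < 8

print(gate_el6, use_yum)  # True False
```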
9396 1727204058.17230: no more pending results, returning what we have
9396 1727204058.17234: results queue empty
9396 1727204058.17235: checking for any_errors_fatal
9396 1727204058.17243: done checking for any_errors_fatal
9396 1727204058.17244: checking for max_fail_percentage
9396 1727204058.17246: done checking for max_fail_percentage
9396 1727204058.17247: checking to see if all hosts have failed and the running result is not ok
9396 1727204058.17248: done checking to see if all hosts have failed
9396 1727204058.17249: getting the remaining hosts for this loop
9396 1727204058.17251: done getting the remaining hosts for this loop
9396 1727204058.17255: getting the next task for host managed-node1
9396 1727204058.17263: done getting next task for host managed-node1
9396 1727204058.17267: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
9396 1727204058.17271: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
9396 1727204058.17291: getting variables
9396 1727204058.17293: in VariableManager get_vars()
9396 1727204058.17335: Calling all_inventory to load vars for managed-node1
9396 1727204058.17339: Calling groups_inventory to load vars for managed-node1
9396 1727204058.17342: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204058.17353: Calling all_plugins_play to load vars for managed-node1
9396 1727204058.17356: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204058.17359: Calling groups_plugins_play to load vars for managed-node1
9396 1727204058.20758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204058.23738: done with get_vars()
9396 1727204058.23787: done getting variables
9396 1727204058.23862: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.140) 0:00:34.210 *****
9396 1727204058.23908: entering _queue_task() for managed-node1/fail
9396 1727204058.24338: worker is 1 (out of 1 available)
9396 1727204058.24353: exiting _queue_task() for managed-node1/fail
9396 1727204058.24367: done queuing things up, now waiting for results queue to drain
9396 1727204058.24369: waiting for pending results...
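(Editor's note: the consent task queued here uses the `fail` action, as the ActionModule load above shows. A minimal illustrative sketch only — the message text and the `network_allow_restart` guard variable are hypothetical, not taken from the role — of how a consent gate like this can be written:)

```yaml
# Illustrative sketch, NOT the actual task from fedora.linux_system_roles.network.
# The variable network_allow_restart is hypothetical.
- name: Ask user's consent to restart NetworkManager
  ansible.builtin.fail:
    msg: >-
      Wireless or team profiles require restarting NetworkManager;
      set network_allow_restart=true to allow it.
  # Skipped on this run because neither condition held, per the log:
  when: __network_wireless_connections_defined or __network_team_connections_defined
```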
9396 1727204058.24725: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
9396 1727204058.24998: in run() - task 12b410aa-8751-36c5-1f9e-000000000084
9396 1727204058.25002: variable 'ansible_search_path' from source: unknown
9396 1727204058.25006: variable 'ansible_search_path' from source: unknown
9396 1727204058.25008: calling self._execute()
9396 1727204058.25076: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204058.25092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204058.25112: variable 'omit' from source: magic vars
9396 1727204058.25571: variable 'ansible_distribution_major_version' from source: facts
9396 1727204058.25595: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204058.25776: variable '__network_wireless_connections_defined' from source: role '' defaults
9396 1727204058.26120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
9396 1727204058.35399: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
9396 1727204058.35463: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
9396 1727204058.35538: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
9396 1727204058.35547: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
9396 1727204058.35579: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
9396 1727204058.35664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204058.35756: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204058.35760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204058.35780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204058.35799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204058.35862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204058.35883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204058.35910: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204058.35955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204058.35971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204058.36017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204058.36042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204058.36067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204058.36114: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
9396 1727204058.36129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
9396 1727204058.36358: variable 'network_connections' from source: task vars
9396 1727204058.36361: variable 'port2_profile' from source: play vars
9396 1727204058.36447: variable 'port2_profile' from source: play vars
9396 1727204058.36458: variable 'port1_profile' from source: play vars
9396 1727204058.36536: variable 'port1_profile' from source: play vars
9396 1727204058.36546: variable 'controller_profile' from source: play vars
9396 1727204058.36650: variable 'controller_profile' from source: play vars
9396 1727204058.36707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9396 1727204058.36951: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9396 1727204058.37019: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9396 1727204058.37060: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9396 1727204058.37092: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9396 1727204058.37146: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
9396 1727204058.37175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
9396 1727204058.37208: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204058.37246: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
9396 1727204058.37291: variable '__network_team_connections_defined' from source: role '' defaults
9396 1727204058.37646: variable 'network_connections' from source: task vars
9396 1727204058.37652: variable 'port2_profile' from source: play vars
9396 1727204058.37728: variable 'port2_profile' from source: play vars
9396 1727204058.37737: variable 'port1_profile' from source: play vars
9396 1727204058.37827: variable 'port1_profile' from source: play vars
9396 1727204058.37836: variable 'controller_profile' from source: play vars
9396 1727204058.37909: variable 'controller_profile' from source: play vars
9396 1727204058.37945: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
9396 1727204058.37956: when evaluation is False, skipping this task
9396 1727204058.37959: _execute() done
9396 1727204058.37962: dumping result to json
9396 1727204058.37965: done dumping result, returning
9396 1727204058.37967: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-36c5-1f9e-000000000084]
9396 1727204058.37969: sending task result for task 12b410aa-8751-36c5-1f9e-000000000084
9396 1727204058.38346: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000084
9396 1727204058.38349: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
9396 1727204058.38399: no more pending results, returning what we have
9396 1727204058.38403: results queue empty
9396 1727204058.38404: checking for any_errors_fatal
9396 1727204058.38408: done checking for any_errors_fatal
9396 1727204058.38409: checking for max_fail_percentage
9396 1727204058.38411: done checking for max_fail_percentage
9396 1727204058.38412: checking to see if all hosts have failed and the running result is not ok
9396 1727204058.38413: done checking to see if all hosts have failed
9396 1727204058.38414: getting the remaining hosts for this loop
9396 1727204058.38415: done getting the remaining hosts for this loop
9396 1727204058.38418: getting the next task for host managed-node1
9396 1727204058.38425: done getting next task for host managed-node1
9396 1727204058.38429: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
9396 1727204058.38433: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
9396 1727204058.38452: getting variables
9396 1727204058.38453: in VariableManager get_vars()
9396 1727204058.38497: Calling all_inventory to load vars for managed-node1
9396 1727204058.38500: Calling groups_inventory to load vars for managed-node1
9396 1727204058.38503: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204058.38513: Calling all_plugins_play to load vars for managed-node1
9396 1727204058.38516: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204058.38520: Calling groups_plugins_play to load vars for managed-node1
9396 1727204058.45675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204058.49015: done with get_vars()
9396 1727204058.49059: done getting variables
9396 1727204058.49126: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.252) 0:00:34.462 *****
9396 1727204058.49167: entering _queue_task() for managed-node1/package
9396 1727204058.49544: worker is 1 (out of 1 available)
9396 1727204058.49556: exiting _queue_task() for managed-node1/package
9396 1727204058.49570: done queuing things up, now waiting for results queue to drain
9396 1727204058.49571: waiting for pending results...
9396 1727204058.49915: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages
9396 1727204058.50126: in run() - task 12b410aa-8751-36c5-1f9e-000000000085
9396 1727204058.50131: variable 'ansible_search_path' from source: unknown
9396 1727204058.50133: variable 'ansible_search_path' from source: unknown
9396 1727204058.50162: calling self._execute()
9396 1727204058.50282: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204058.50345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204058.50349: variable 'omit' from source: magic vars
9396 1727204058.50796: variable 'ansible_distribution_major_version' from source: facts
9396 1727204058.50820: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204058.51086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
9396 1727204058.51415: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
9396 1727204058.51478: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
9396 1727204058.51653: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
9396 1727204058.51656: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
9396 1727204058.51772: variable 'network_packages' from source: role '' defaults
9396 1727204058.51919: variable '__network_provider_setup' from source: role '' defaults
9396 1727204058.51936: variable '__network_service_name_default_nm' from source: role '' defaults
9396 1727204058.52024: variable '__network_service_name_default_nm' from source: role '' defaults
9396 1727204058.52039: variable '__network_packages_default_nm' from source: role '' defaults
9396 1727204058.52122: variable '__network_packages_default_nm' from source: role '' defaults
9396 1727204058.52391: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
9396 1727204058.54944: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
9396 1727204058.55059: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
9396 1727204058.55218: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
9396 1727204058.55223: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
9396 1727204058.55252: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
9396 1727204058.55370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
9396 1727204058.55421: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
9396 1727204058.55470: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
9396 1727204058.55533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204058.55572: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204058.55667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204058.55761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204058.55765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.55812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204058.55883: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204058.56345: variable '__network_packages_default_gobject_packages' from source: role '' defaults 9396 1727204058.56496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204058.56566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 9396 1727204058.56609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.56674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204058.56701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204058.56834: variable 'ansible_python' from source: facts 9396 1727204058.56879: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 9396 1727204058.57074: variable '__network_wpa_supplicant_required' from source: role '' defaults 9396 1727204058.57119: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 9396 1727204058.57303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204058.57354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204058.57392: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.57494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204058.57532: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204058.57583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204058.57613: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204058.57635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.57667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204058.57680: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204058.57813: variable 'network_connections' from source: task vars 9396 1727204058.57818: variable 'port2_profile' from source: play vars 9396 1727204058.57903: variable 'port2_profile' from source: play vars 9396 1727204058.57914: variable 'port1_profile' from source: play vars 9396 1727204058.57998: variable 'port1_profile' from source: play vars 9396 1727204058.58010: variable 'controller_profile' from source: play vars 9396 1727204058.58088: variable 'controller_profile' from source: play vars 9396 1727204058.58148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 9396 1727204058.58175: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204058.58203: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.58230: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204058.58280: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204058.58514: variable 'network_connections' from source: task vars 9396 1727204058.58518: variable 'port2_profile' from source: play vars 9396 1727204058.58595: variable 'port2_profile' from source: play vars 9396 1727204058.58612: variable 'port1_profile' from source: play vars 9396 1727204058.58686: variable 'port1_profile' from source: play vars 9396 1727204058.58697: variable 'controller_profile' from source: play vars 9396 1727204058.58779: variable 'controller_profile' from source: play vars 9396 1727204058.58810: variable '__network_packages_default_wireless' from source: role '' defaults 9396 1727204058.58877: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204058.59131: variable 'network_connections' from source: task vars 9396 1727204058.59135: variable 'port2_profile' from source: play vars 9396 1727204058.59193: variable 'port2_profile' from source: play vars 9396 1727204058.59201: variable 'port1_profile' from source: play vars 9396 1727204058.59256: variable 'port1_profile' from source: play vars 9396 1727204058.59265: variable 'controller_profile' from source: play vars 9396 1727204058.59337: variable 'controller_profile' from 
source: play vars 9396 1727204058.59358: variable '__network_packages_default_team' from source: role '' defaults 9396 1727204058.59527: variable '__network_team_connections_defined' from source: role '' defaults 9396 1727204058.60008: variable 'network_connections' from source: task vars 9396 1727204058.60011: variable 'port2_profile' from source: play vars 9396 1727204058.60084: variable 'port2_profile' from source: play vars 9396 1727204058.60116: variable 'port1_profile' from source: play vars 9396 1727204058.60184: variable 'port1_profile' from source: play vars 9396 1727204058.60225: variable 'controller_profile' from source: play vars 9396 1727204058.60287: variable 'controller_profile' from source: play vars 9396 1727204058.60423: variable '__network_service_name_default_initscripts' from source: role '' defaults 9396 1727204058.60449: variable '__network_service_name_default_initscripts' from source: role '' defaults 9396 1727204058.60463: variable '__network_packages_default_initscripts' from source: role '' defaults 9396 1727204058.60543: variable '__network_packages_default_initscripts' from source: role '' defaults 9396 1727204058.60792: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 9396 1727204058.61356: variable 'network_connections' from source: task vars 9396 1727204058.61361: variable 'port2_profile' from source: play vars 9396 1727204058.61416: variable 'port2_profile' from source: play vars 9396 1727204058.61425: variable 'port1_profile' from source: play vars 9396 1727204058.61476: variable 'port1_profile' from source: play vars 9396 1727204058.61484: variable 'controller_profile' from source: play vars 9396 1727204058.61537: variable 'controller_profile' from source: play vars 9396 1727204058.61547: variable 'ansible_distribution' from source: facts 9396 1727204058.61551: variable '__network_rh_distros' from source: role '' defaults 9396 1727204058.61558: variable 'ansible_distribution_major_version' 
from source: facts 9396 1727204058.61573: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 9396 1727204058.61716: variable 'ansible_distribution' from source: facts 9396 1727204058.61720: variable '__network_rh_distros' from source: role '' defaults 9396 1727204058.61726: variable 'ansible_distribution_major_version' from source: facts 9396 1727204058.61733: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 9396 1727204058.62000: variable 'ansible_distribution' from source: facts 9396 1727204058.62004: variable '__network_rh_distros' from source: role '' defaults 9396 1727204058.62009: variable 'ansible_distribution_major_version' from source: facts 9396 1727204058.62012: variable 'network_provider' from source: set_fact 9396 1727204058.62017: variable 'ansible_facts' from source: unknown 9396 1727204058.63696: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 9396 1727204058.63700: when evaluation is False, skipping this task 9396 1727204058.63703: _execute() done 9396 1727204058.63706: dumping result to json 9396 1727204058.63711: done dumping result, returning 9396 1727204058.63722: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-36c5-1f9e-000000000085] 9396 1727204058.63727: sending task result for task 12b410aa-8751-36c5-1f9e-000000000085 9396 1727204058.63849: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000085 9396 1727204058.63852: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 9396 1727204058.63913: no more pending results, returning what we have 9396 1727204058.63918: results queue empty 9396 1727204058.63920: checking for any_errors_fatal 9396 1727204058.63931: done checking for 
any_errors_fatal 9396 1727204058.63932: checking for max_fail_percentage 9396 1727204058.63934: done checking for max_fail_percentage 9396 1727204058.63935: checking to see if all hosts have failed and the running result is not ok 9396 1727204058.63936: done checking to see if all hosts have failed 9396 1727204058.63937: getting the remaining hosts for this loop 9396 1727204058.63940: done getting the remaining hosts for this loop 9396 1727204058.63950: getting the next task for host managed-node1 9396 1727204058.63956: done getting next task for host managed-node1 9396 1727204058.63963: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 9396 1727204058.63967: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204058.63986: getting variables 9396 1727204058.63988: in VariableManager get_vars() 9396 1727204058.64037: Calling all_inventory to load vars for managed-node1 9396 1727204058.64040: Calling groups_inventory to load vars for managed-node1 9396 1727204058.64043: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204058.64053: Calling all_plugins_play to load vars for managed-node1 9396 1727204058.64056: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204058.64059: Calling groups_plugins_play to load vars for managed-node1 9396 1727204058.66252: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204058.70393: done with get_vars() 9396 1727204058.70437: done getting variables 9396 1727204058.70629: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.215) 0:00:34.677 ***** 9396 1727204058.70696: entering _queue_task() for managed-node1/package 9396 1727204058.71060: worker is 1 (out of 1 available) 9396 1727204058.71073: exiting _queue_task() for managed-node1/package 9396 1727204058.71090: done queuing things up, now waiting for results queue to drain 9396 1727204058.71092: waiting for pending results... 
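The "Install packages" task above is skipped because its `when:` clause, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False. A minimal Python sketch of what that Jinja2 `subset` test amounts to, using illustrative package names rather than values from this run:

```python
# Approximation of the Jinja2 `subset` test driving the skip above.
# The package lists here are hypothetical, not taken from the log.
def is_subset(needed, installed):
    """Mirror `network_packages is subset(ansible_facts.packages.keys())`."""
    return set(needed) <= set(installed)

network_packages = ["NetworkManager"]            # hypothetical role default
installed = {"NetworkManager": [], "bash": []}   # hypothetical packages fact

# when: not network_packages is subset(ansible_facts.packages.keys())
should_install = not is_subset(network_packages, installed.keys())
print(should_install)  # False: everything already present, so the task skips
```

When the condition is False, the executor records `"skip_reason": "Conditional result was False"` and reports the `false_condition`, exactly as seen in the skip result above.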
9396 1727204058.71515: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 9396 1727204058.71635: in run() - task 12b410aa-8751-36c5-1f9e-000000000086 9396 1727204058.71644: variable 'ansible_search_path' from source: unknown 9396 1727204058.71653: variable 'ansible_search_path' from source: unknown 9396 1727204058.71706: calling self._execute() 9396 1727204058.71832: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204058.71851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204058.71873: variable 'omit' from source: magic vars 9396 1727204058.72398: variable 'ansible_distribution_major_version' from source: facts 9396 1727204058.72411: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204058.72632: variable 'network_state' from source: role '' defaults 9396 1727204058.72649: Evaluated conditional (network_state != {}): False 9396 1727204058.72657: when evaluation is False, skipping this task 9396 1727204058.72724: _execute() done 9396 1727204058.72729: dumping result to json 9396 1727204058.72732: done dumping result, returning 9396 1727204058.72735: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-36c5-1f9e-000000000086] 9396 1727204058.72737: sending task result for task 12b410aa-8751-36c5-1f9e-000000000086 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 9396 1727204058.73020: no more pending results, returning what we have 9396 1727204058.73026: results queue empty 9396 1727204058.73028: checking for any_errors_fatal 9396 1727204058.73035: done checking for any_errors_fatal 9396 1727204058.73036: checking for max_fail_percentage 9396 1727204058.73038: done 
checking for max_fail_percentage 9396 1727204058.73039: checking to see if all hosts have failed and the running result is not ok 9396 1727204058.73040: done checking to see if all hosts have failed 9396 1727204058.73041: getting the remaining hosts for this loop 9396 1727204058.73048: done getting the remaining hosts for this loop 9396 1727204058.73054: getting the next task for host managed-node1 9396 1727204058.73062: done getting next task for host managed-node1 9396 1727204058.73066: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 9396 1727204058.73071: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204058.73097: getting variables 9396 1727204058.73099: in VariableManager get_vars() 9396 1727204058.73296: Calling all_inventory to load vars for managed-node1 9396 1727204058.73300: Calling groups_inventory to load vars for managed-node1 9396 1727204058.73304: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204058.73314: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000086 9396 1727204058.73317: WORKER PROCESS EXITING 9396 1727204058.73328: Calling all_plugins_play to load vars for managed-node1 9396 1727204058.73332: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204058.73336: Calling groups_plugins_play to load vars for managed-node1 9396 1727204058.75832: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204058.78848: done with get_vars() 9396 1727204058.78894: done getting variables 9396 1727204058.78975: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.083) 0:00:34.761 ***** 9396 1727204058.79023: entering _queue_task() for managed-node1/package 9396 1727204058.79431: worker is 1 (out of 1 available) 9396 1727204058.79445: exiting _queue_task() for managed-node1/package 9396 1727204058.79460: done queuing things up, now waiting for results queue to drain 9396 1727204058.79462: waiting for pending results... 
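Both nmstate-related install tasks in this stretch of the log are gated on the same conditional, `network_state != {}`: the role only installs NetworkManager and nmstate when the caller supplied a non-empty `network_state`. A small sketch of that gate, with illustrative values:

```python
# Sketch of the `network_state != {}` conditional evaluated above.
# The sample states are illustrative, not from this run.
def needs_nmstate(network_state):
    # when: network_state != {}
    return network_state != {}

print(needs_nmstate({}))                  # False -> task skipped, as in the log
print(needs_nmstate({"interfaces": []}))  # True  -> task would run
```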
9396 1727204058.79777: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 9396 1727204058.79971: in run() - task 12b410aa-8751-36c5-1f9e-000000000087 9396 1727204058.79996: variable 'ansible_search_path' from source: unknown 9396 1727204058.80007: variable 'ansible_search_path' from source: unknown 9396 1727204058.80058: calling self._execute() 9396 1727204058.80185: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204058.80202: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204058.80221: variable 'omit' from source: magic vars 9396 1727204058.80684: variable 'ansible_distribution_major_version' from source: facts 9396 1727204058.80708: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204058.81086: variable 'network_state' from source: role '' defaults 9396 1727204058.81101: Evaluated conditional (network_state != {}): False 9396 1727204058.81105: when evaluation is False, skipping this task 9396 1727204058.81108: _execute() done 9396 1727204058.81117: dumping result to json 9396 1727204058.81121: done dumping result, returning 9396 1727204058.81131: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-36c5-1f9e-000000000087] 9396 1727204058.81227: sending task result for task 12b410aa-8751-36c5-1f9e-000000000087 skipping: [managed-node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 9396 1727204058.81388: no more pending results, returning what we have 9396 1727204058.81401: results queue empty 9396 1727204058.81402: checking for any_errors_fatal 9396 1727204058.81414: done checking for any_errors_fatal 9396 1727204058.81415: checking for max_fail_percentage 9396 1727204058.81417: done checking for 
max_fail_percentage 9396 1727204058.81418: checking to see if all hosts have failed and the running result is not ok 9396 1727204058.81419: done checking to see if all hosts have failed 9396 1727204058.81420: getting the remaining hosts for this loop 9396 1727204058.81422: done getting the remaining hosts for this loop 9396 1727204058.81427: getting the next task for host managed-node1 9396 1727204058.81436: done getting next task for host managed-node1 9396 1727204058.81498: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 9396 1727204058.81504: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204058.81527: getting variables 9396 1727204058.81529: in VariableManager get_vars() 9396 1727204058.81696: Calling all_inventory to load vars for managed-node1 9396 1727204058.81700: Calling groups_inventory to load vars for managed-node1 9396 1727204058.81703: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204058.81710: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000087 9396 1727204058.81714: WORKER PROCESS EXITING 9396 1727204058.81730: Calling all_plugins_play to load vars for managed-node1 9396 1727204058.81734: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204058.81739: Calling groups_plugins_play to load vars for managed-node1 9396 1727204058.86515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204058.89895: done with get_vars() 9396 1727204058.89943: done getting variables 9396 1727204058.90019: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:54:18 -0400 (0:00:00.110) 0:00:34.871 ***** 9396 1727204058.90064: entering _queue_task() for managed-node1/service 9396 1727204058.90464: worker is 1 (out of 1 available) 9396 1727204058.90479: exiting _queue_task() for managed-node1/service 9396 1727204058.90498: done queuing things up, now waiting for results queue to drain 9396 1727204058.90500: waiting for pending results... 
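Each task in the log follows the same lifecycle: `entering _queue_task()`, a worker picks it up, the strategy notes "done queuing things up, now waiting for results queue to drain", and the result comes back via "sending task result". A loose, illustrative sketch of that queue/worker pattern (not Ansible's actual implementation, which uses forked worker processes):

```python
# Illustrative queue/worker pattern mirroring the task lifecycle in the log:
# queue a task, let a worker process it, drain the results queue.
import queue
import threading

tasks, results = queue.Queue(), queue.Queue()

def worker():
    while True:
        task = tasks.get()
        if task is None:          # sentinel: WORKER PROCESS EXITING
            break
        results.put(f"done: {task}")   # "sending task result"
        tasks.task_done()

t = threading.Thread(target=worker)
t.start()
tasks.put("service: Restart NetworkManager")  # entering _queue_task()
tasks.join()                                  # waiting for pending results...
tasks.put(None)
t.join()
msg = results.get()                           # results queue drained
print(msg)  # done: service: Restart NetworkManager
```

In the real run only one worker is available ("worker is 1 (out of 1 available)"), so tasks are processed strictly one at a time, which matches the serialized ordering of the log entries.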
9396 1727204058.90833: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 9396 1727204058.91041: in run() - task 12b410aa-8751-36c5-1f9e-000000000088 9396 1727204058.91063: variable 'ansible_search_path' from source: unknown 9396 1727204058.91073: variable 'ansible_search_path' from source: unknown 9396 1727204058.91126: calling self._execute() 9396 1727204058.91241: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204058.91256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204058.91272: variable 'omit' from source: magic vars 9396 1727204058.91724: variable 'ansible_distribution_major_version' from source: facts 9396 1727204058.91742: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204058.91884: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204058.92164: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204058.95298: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204058.95365: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204058.95415: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204058.95469: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204058.95507: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204058.95616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 
1727204058.95683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204058.95703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.95760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204058.95791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204058.95901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204058.95905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204058.95935: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.95992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204058.96024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, 
class_only=False) 9396 1727204058.96079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204058.96194: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204058.96199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.96227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204058.96249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204058.96503: variable 'network_connections' from source: task vars 9396 1727204058.96526: variable 'port2_profile' from source: play vars 9396 1727204058.96624: variable 'port2_profile' from source: play vars 9396 1727204058.96640: variable 'port1_profile' from source: play vars 9396 1727204058.96727: variable 'port1_profile' from source: play vars 9396 1727204058.96744: variable 'controller_profile' from source: play vars 9396 1727204058.96829: variable 'controller_profile' from source: play vars 9396 1727204058.96931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204058.97169: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204058.97231: Loading TestModule 'files' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204058.97319: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204058.97323: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204058.97373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204058.97407: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204058.97450: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204058.97486: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204058.97558: variable '__network_team_connections_defined' from source: role '' defaults 9396 1727204058.97912: variable 'network_connections' from source: task vars 9396 1727204058.97973: variable 'port2_profile' from source: play vars 9396 1727204058.98013: variable 'port2_profile' from source: play vars 9396 1727204058.98027: variable 'port1_profile' from source: play vars 9396 1727204058.98117: variable 'port1_profile' from source: play vars 9396 1727204058.98131: variable 'controller_profile' from source: play vars 9396 1727204058.98211: variable 'controller_profile' from source: play vars 9396 1727204058.98246: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 9396 1727204058.98299: when evaluation is False, skipping this task 9396 
1727204058.98302: _execute() done 9396 1727204058.98304: dumping result to json 9396 1727204058.98307: done dumping result, returning 9396 1727204058.98309: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-36c5-1f9e-000000000088] 9396 1727204058.98315: sending task result for task 12b410aa-8751-36c5-1f9e-000000000088 9396 1727204058.98630: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000088 9396 1727204058.98633: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 9396 1727204058.98686: no more pending results, returning what we have 9396 1727204058.98693: results queue empty 9396 1727204058.98694: checking for any_errors_fatal 9396 1727204058.98702: done checking for any_errors_fatal 9396 1727204058.98703: checking for max_fail_percentage 9396 1727204058.98704: done checking for max_fail_percentage 9396 1727204058.98706: checking to see if all hosts have failed and the running result is not ok 9396 1727204058.98707: done checking to see if all hosts have failed 9396 1727204058.98708: getting the remaining hosts for this loop 9396 1727204058.98709: done getting the remaining hosts for this loop 9396 1727204058.98714: getting the next task for host managed-node1 9396 1727204058.98721: done getting next task for host managed-node1 9396 1727204058.98726: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 9396 1727204058.98730: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204058.98754: getting variables 9396 1727204058.98756: in VariableManager get_vars() 9396 1727204058.98803: Calling all_inventory to load vars for managed-node1 9396 1727204058.98806: Calling groups_inventory to load vars for managed-node1 9396 1727204058.98809: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204058.98821: Calling all_plugins_play to load vars for managed-node1 9396 1727204058.98826: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204058.98830: Calling groups_plugins_play to load vars for managed-node1 9396 1727204059.03374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204059.09378: done with get_vars() 9396 1727204059.09629: done getting variables 9396 1727204059.09701: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.196) 
0:00:35.068 ***** 9396 1727204059.09746: entering _queue_task() for managed-node1/service 9396 1727204059.10659: worker is 1 (out of 1 available) 9396 1727204059.10674: exiting _queue_task() for managed-node1/service 9396 1727204059.10688: done queuing things up, now waiting for results queue to drain 9396 1727204059.11029: waiting for pending results... 9396 1727204059.11361: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 9396 1727204059.11792: in run() - task 12b410aa-8751-36c5-1f9e-000000000089 9396 1727204059.11998: variable 'ansible_search_path' from source: unknown 9396 1727204059.12002: variable 'ansible_search_path' from source: unknown 9396 1727204059.12006: calling self._execute() 9396 1727204059.12250: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204059.12295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204059.12544: variable 'omit' from source: magic vars 9396 1727204059.13462: variable 'ansible_distribution_major_version' from source: facts 9396 1727204059.13482: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204059.13963: variable 'network_provider' from source: set_fact 9396 1727204059.14004: variable 'network_state' from source: role '' defaults 9396 1727204059.14022: Evaluated conditional (network_provider == "nm" or network_state != {}): True 9396 1727204059.14036: variable 'omit' from source: magic vars 9396 1727204059.14144: variable 'omit' from source: magic vars 9396 1727204059.14297: variable 'network_service_name' from source: role '' defaults 9396 1727204059.14301: variable 'network_service_name' from source: role '' defaults 9396 1727204059.14426: variable '__network_provider_setup' from source: role '' defaults 9396 1727204059.14439: variable '__network_service_name_default_nm' from source: role '' defaults 9396 1727204059.14526: variable 
'__network_service_name_default_nm' from source: role '' defaults 9396 1727204059.14542: variable '__network_packages_default_nm' from source: role '' defaults 9396 1727204059.14625: variable '__network_packages_default_nm' from source: role '' defaults 9396 1727204059.14932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204059.18262: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204059.18384: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204059.18455: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204059.18513: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204059.18556: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204059.18658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204059.18704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204059.18743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.18823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204059.18878: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204059.18918: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204059.18959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204059.19003: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.19097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204059.19102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204059.19421: variable '__network_packages_default_gobject_packages' from source: role '' defaults 9396 1727204059.19628: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204059.19695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204059.19704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.19896: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204059.19900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204059.19996: variable 'ansible_python' from source: facts 9396 1727204059.20034: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 9396 1727204059.20410: variable '__network_wpa_supplicant_required' from source: role '' defaults 9396 1727204059.20413: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 9396 1727204059.20582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204059.20623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204059.20671: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.20727: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204059.20795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204059.20824: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204059.20871: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204059.20961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.21120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204059.21173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204059.21343: variable 'network_connections' from source: task vars 9396 1727204059.21358: variable 'port2_profile' from source: play vars 9396 1727204059.21462: variable 'port2_profile' from source: play vars 9396 1727204059.21499: variable 'port1_profile' from source: play vars 9396 1727204059.21582: variable 'port1_profile' from source: play vars 9396 1727204059.21698: variable 'controller_profile' from source: play vars 9396 1727204059.21702: variable 'controller_profile' from source: play vars 9396 1727204059.21841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204059.22105: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 
1727204059.22176: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204059.22236: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204059.22298: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204059.22383: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204059.22437: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204059.22488: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.22552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204059.22627: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204059.23087: variable 'network_connections' from source: task vars 9396 1727204059.23103: variable 'port2_profile' from source: play vars 9396 1727204059.23203: variable 'port2_profile' from source: play vars 9396 1727204059.23243: variable 'port1_profile' from source: play vars 9396 1727204059.23324: variable 'port1_profile' from source: play vars 9396 1727204059.23342: variable 'controller_profile' from source: play vars 9396 1727204059.23440: variable 'controller_profile' from source: play vars 9396 1727204059.23494: variable '__network_packages_default_wireless' from source: role '' defaults 9396 1727204059.23608: variable 
'__network_wireless_connections_defined' from source: role '' defaults 9396 1727204059.24021: variable 'network_connections' from source: task vars 9396 1727204059.24033: variable 'port2_profile' from source: play vars 9396 1727204059.24128: variable 'port2_profile' from source: play vars 9396 1727204059.24142: variable 'port1_profile' from source: play vars 9396 1727204059.24234: variable 'port1_profile' from source: play vars 9396 1727204059.24365: variable 'controller_profile' from source: play vars 9396 1727204059.24368: variable 'controller_profile' from source: play vars 9396 1727204059.24442: variable '__network_packages_default_team' from source: role '' defaults 9396 1727204059.24833: variable '__network_team_connections_defined' from source: role '' defaults 9396 1727204059.25570: variable 'network_connections' from source: task vars 9396 1727204059.25582: variable 'port2_profile' from source: play vars 9396 1727204059.25752: variable 'port2_profile' from source: play vars 9396 1727204059.25953: variable 'port1_profile' from source: play vars 9396 1727204059.25956: variable 'port1_profile' from source: play vars 9396 1727204059.25959: variable 'controller_profile' from source: play vars 9396 1727204059.26149: variable 'controller_profile' from source: play vars 9396 1727204059.26349: variable '__network_service_name_default_initscripts' from source: role '' defaults 9396 1727204059.26449: variable '__network_service_name_default_initscripts' from source: role '' defaults 9396 1727204059.26471: variable '__network_packages_default_initscripts' from source: role '' defaults 9396 1727204059.26606: variable '__network_packages_default_initscripts' from source: role '' defaults 9396 1727204059.26964: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 9396 1727204059.28196: variable 'network_connections' from source: task vars 9396 1727204059.28251: variable 'port2_profile' from source: play vars 9396 1727204059.28459: 
variable 'port2_profile' from source: play vars 9396 1727204059.28464: variable 'port1_profile' from source: play vars 9396 1727204059.28546: variable 'port1_profile' from source: play vars 9396 1727204059.28659: variable 'controller_profile' from source: play vars 9396 1727204059.28752: variable 'controller_profile' from source: play vars 9396 1727204059.28924: variable 'ansible_distribution' from source: facts 9396 1727204059.28941: variable '__network_rh_distros' from source: role '' defaults 9396 1727204059.28953: variable 'ansible_distribution_major_version' from source: facts 9396 1727204059.28977: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 9396 1727204059.29246: variable 'ansible_distribution' from source: facts 9396 1727204059.29256: variable '__network_rh_distros' from source: role '' defaults 9396 1727204059.29267: variable 'ansible_distribution_major_version' from source: facts 9396 1727204059.29281: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 9396 1727204059.29532: variable 'ansible_distribution' from source: facts 9396 1727204059.29544: variable '__network_rh_distros' from source: role '' defaults 9396 1727204059.29560: variable 'ansible_distribution_major_version' from source: facts 9396 1727204059.29615: variable 'network_provider' from source: set_fact 9396 1727204059.29651: variable 'omit' from source: magic vars 9396 1727204059.29773: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204059.29777: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204059.29780: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204059.29795: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204059.29816: 
Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204059.29856: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204059.29866: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204059.29880: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204059.30030: Set connection var ansible_timeout to 10 9396 1727204059.30044: Set connection var ansible_shell_executable to /bin/sh 9396 1727204059.30059: Set connection var ansible_pipelining to False 9396 1727204059.30071: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204059.30082: Set connection var ansible_connection to ssh 9396 1727204059.30094: Set connection var ansible_shell_type to sh 9396 1727204059.30136: variable 'ansible_shell_executable' from source: unknown 9396 1727204059.30194: variable 'ansible_connection' from source: unknown 9396 1727204059.30198: variable 'ansible_module_compression' from source: unknown 9396 1727204059.30204: variable 'ansible_shell_type' from source: unknown 9396 1727204059.30211: variable 'ansible_shell_executable' from source: unknown 9396 1727204059.30214: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204059.30217: variable 'ansible_pipelining' from source: unknown 9396 1727204059.30220: variable 'ansible_timeout' from source: unknown 9396 1727204059.30223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204059.30347: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204059.30367: variable 'omit' from source: magic vars 9396 1727204059.30379: 
starting attempt loop 9396 1727204059.30424: running the handler 9396 1727204059.30502: variable 'ansible_facts' from source: unknown 9396 1727204059.32781: _low_level_execute_command(): starting 9396 1727204059.32815: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204059.33804: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204059.33861: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204059.33888: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204059.33906: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204059.33966: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204059.35780: stdout chunk (state=3): >>>/root <<< 9396 1727204059.35945: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204059.36004: stderr chunk (state=3): >>><<< 9396 1727204059.36016: stdout chunk (state=3): >>><<< 9396 1727204059.36045: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204059.36056: _low_level_execute_command(): starting 9396 1727204059.36060: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838 `" && echo ansible-tmp-1727204059.3603902-11714-8387645184838="` echo /root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838 `" ) && sleep 0' 9396 1727204059.36802: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204059.36822: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204059.36841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204059.36885: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204059.37003: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204059.37012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204059.37135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204059.37140: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204059.37187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204059.39274: stdout chunk (state=3): >>>ansible-tmp-1727204059.3603902-11714-8387645184838=/root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838 <<< 9396 1727204059.39595: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204059.39599: stdout chunk (state=3): >>><<< 9396 1727204059.39602: stderr chunk (state=3): >>><<< 9396 1727204059.39605: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204059.3603902-11714-8387645184838=/root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204059.39611: variable 'ansible_module_compression' from source: unknown 9396 1727204059.39658: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 9396 1727204059.39728: variable 'ansible_facts' from source: unknown 9396 1727204059.39979: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/AnsiballZ_systemd.py 9396 1727204059.40201: Sending initial data 9396 1727204059.40204: Sent initial data (153 bytes) 9396 1727204059.40900: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204059.40914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204059.40962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204059.41058: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204059.41084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204059.41195: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204059.42943: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204059.42982: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204059.43068: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmphr1agwfx /root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/AnsiballZ_systemd.py <<< 9396 1727204059.43071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/AnsiballZ_systemd.py" <<< 9396 1727204059.43116: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmphr1agwfx" to remote "/root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/AnsiballZ_systemd.py" <<< 9396 1727204059.45697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204059.45776: stderr chunk (state=3): >>><<< 9396 1727204059.45786: stdout chunk (state=3): >>><<< 9396 1727204059.45833: done transferring module to remote 9396 1727204059.45852: _low_level_execute_command(): starting 9396 1727204059.45862: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/ /root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/AnsiballZ_systemd.py && sleep 0' 9396 1727204059.46687: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204059.46720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204059.46739: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204059.46760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204059.46842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204059.48925: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204059.48929: stdout chunk (state=3): >>><<< 9396 1727204059.48932: stderr chunk (state=3): >>><<< 9396 1727204059.48956: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204059.48965: _low_level_execute_command(): starting 9396 1727204059.48975: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/AnsiballZ_systemd.py && sleep 0' 9396 1727204059.49625: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204059.49641: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204059.49659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204059.49746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204059.49785: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204059.49806: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204059.49827: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204059.49953: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 
1727204059.83451: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11964416", "MemoryAvailable": "infinity", "CPUUsageNSec": "558735000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": 
"infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "inf<<< 9396 1727204059.83590: stdout chunk (state=3): >>>inity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", 
"SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": 
"dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "lo<<< 9396 1727204059.83611: stdout chunk (state=3): >>>aded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": 
{"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 9396 1727204059.85659: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204059.85722: stderr chunk (state=3): >>><<< 9396 1727204059.85726: stdout chunk (state=3): >>><<< 9396 1727204059.85744: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "651", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ExecMainStartTimestampMonotonic": "17567139", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "651", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; 
argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "3378", "MemoryCurrent": "11964416", "MemoryAvailable": "infinity", "CPUUsageNSec": "558735000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", 
"StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "system.slice sysinit.target dbus.socket", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "network.service multi-user.target cloud-init.service NetworkManager-wait-online.service network.target shutdown.target", "After": "dbus-broker.service system.slice dbus.socket systemd-journald.socket basic.target sysinit.target network-pre.target cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:53:51 EDT", "StateChangeTimestampMonotonic": "521403753", "InactiveExitTimestamp": "Tue 2024-09-24 14:45:27 EDT", "InactiveExitTimestampMonotonic": "17567399", "ActiveEnterTimestamp": "Tue 2024-09-24 14:45:27 EDT", "ActiveEnterTimestampMonotonic": "18019295", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:45:27 EDT", 
"ConditionTimestampMonotonic": "17554557", "AssertTimestamp": "Tue 2024-09-24 14:45:27 EDT", "AssertTimestampMonotonic": "17554559", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ac0fd3fc06b14ac59a7d5e4a43cc5865", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
9396 1727204059.85916: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204059.85937: _low_level_execute_command(): starting 9396 1727204059.85943: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204059.3603902-11714-8387645184838/ > /dev/null 2>&1 && sleep 0' 9396 1727204059.86432: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204059.86436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204059.86438: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204059.86440: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204059.86443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204059.86502: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204059.86508: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204059.86552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204059.88583: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204059.88587: stdout chunk (state=3): >>><<< 9396 1727204059.88603: stderr chunk (state=3): >>><<< 9396 1727204059.88623: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received 
exit status from master 0 9396 1727204059.88696: handler run complete 9396 1727204059.88736: attempt loop complete, returning result 9396 1727204059.88748: _execute() done 9396 1727204059.88757: dumping result to json 9396 1727204059.88786: done dumping result, returning 9396 1727204059.88811: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-36c5-1f9e-000000000089] 9396 1727204059.88824: sending task result for task 12b410aa-8751-36c5-1f9e-000000000089 9396 1727204059.89501: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000089 9396 1727204059.89505: WORKER PROCESS EXITING ok: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204059.89565: no more pending results, returning what we have 9396 1727204059.89570: results queue empty 9396 1727204059.89571: checking for any_errors_fatal 9396 1727204059.89577: done checking for any_errors_fatal 9396 1727204059.89578: checking for max_fail_percentage 9396 1727204059.89580: done checking for max_fail_percentage 9396 1727204059.89581: checking to see if all hosts have failed and the running result is not ok 9396 1727204059.89582: done checking to see if all hosts have failed 9396 1727204059.89583: getting the remaining hosts for this loop 9396 1727204059.89585: done getting the remaining hosts for this loop 9396 1727204059.89735: getting the next task for host managed-node1 9396 1727204059.89744: done getting next task for host managed-node1 9396 1727204059.89749: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 9396 1727204059.89754: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204059.89769: getting variables 9396 1727204059.89771: in VariableManager get_vars() 9396 1727204059.89815: Calling all_inventory to load vars for managed-node1 9396 1727204059.89820: Calling groups_inventory to load vars for managed-node1 9396 1727204059.89823: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204059.89834: Calling all_plugins_play to load vars for managed-node1 9396 1727204059.89837: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204059.89850: Calling groups_plugins_play to load vars for managed-node1 9396 1727204059.91629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204059.93287: done with get_vars() 9396 1727204059.93312: done getting variables 9396 1727204059.93365: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:54:19 -0400 (0:00:00.836) 
0:00:35.904 ***** 9396 1727204059.93396: entering _queue_task() for managed-node1/service 9396 1727204059.93655: worker is 1 (out of 1 available) 9396 1727204059.93670: exiting _queue_task() for managed-node1/service 9396 1727204059.93684: done queuing things up, now waiting for results queue to drain 9396 1727204059.93686: waiting for pending results... 9396 1727204059.93882: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 9396 1727204059.94008: in run() - task 12b410aa-8751-36c5-1f9e-00000000008a 9396 1727204059.94029: variable 'ansible_search_path' from source: unknown 9396 1727204059.94033: variable 'ansible_search_path' from source: unknown 9396 1727204059.94061: calling self._execute() 9396 1727204059.94147: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204059.94160: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204059.94172: variable 'omit' from source: magic vars 9396 1727204059.94502: variable 'ansible_distribution_major_version' from source: facts 9396 1727204059.94514: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204059.94616: variable 'network_provider' from source: set_fact 9396 1727204059.94620: Evaluated conditional (network_provider == "nm"): True 9396 1727204059.94701: variable '__network_wpa_supplicant_required' from source: role '' defaults 9396 1727204059.94774: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 9396 1727204059.94930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204059.96578: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204059.96658: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204059.96684: Loading FilterModule 
'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204059.96794: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204059.96798: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204059.96872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204059.96919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204059.96957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.97017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204059.97041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204059.97106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204059.97144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204059.97181: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.97396: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204059.97400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204059.97402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204059.97404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204059.97407: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.97428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204059.97449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204059.97633: variable 'network_connections' from source: task vars 9396 1727204059.97654: variable 'port2_profile' from source: play vars 9396 1727204059.97737: variable 'port2_profile' from source: play vars 9396 1727204059.97756: variable 
'port1_profile' from source: play vars 9396 1727204059.97836: variable 'port1_profile' from source: play vars 9396 1727204059.97852: variable 'controller_profile' from source: play vars 9396 1727204059.97929: variable 'controller_profile' from source: play vars 9396 1727204059.98023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 9396 1727204059.98162: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 9396 1727204059.98200: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 9396 1727204059.98231: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 9396 1727204059.98255: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 9396 1727204059.98302: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 9396 1727204059.98316: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 9396 1727204059.98337: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204059.98359: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 9396 1727204059.98414: variable '__network_wireless_connections_defined' from source: role '' defaults 9396 1727204059.98627: variable 'network_connections' from source: task vars 9396 1727204059.98633: variable 'port2_profile' from source: play vars 
9396 1727204059.98681: variable 'port2_profile' from source: play vars 9396 1727204059.98690: variable 'port1_profile' from source: play vars 9396 1727204059.98744: variable 'port1_profile' from source: play vars 9396 1727204059.98756: variable 'controller_profile' from source: play vars 9396 1727204059.98805: variable 'controller_profile' from source: play vars 9396 1727204059.98832: Evaluated conditional (__network_wpa_supplicant_required): False 9396 1727204059.98838: when evaluation is False, skipping this task 9396 1727204059.98841: _execute() done 9396 1727204059.98844: dumping result to json 9396 1727204059.98846: done dumping result, returning 9396 1727204059.98857: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-36c5-1f9e-00000000008a] 9396 1727204059.98862: sending task result for task 12b410aa-8751-36c5-1f9e-00000000008a 9396 1727204059.98960: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000008a 9396 1727204059.98963: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 9396 1727204059.99017: no more pending results, returning what we have 9396 1727204059.99022: results queue empty 9396 1727204059.99023: checking for any_errors_fatal 9396 1727204059.99050: done checking for any_errors_fatal 9396 1727204059.99051: checking for max_fail_percentage 9396 1727204059.99054: done checking for max_fail_percentage 9396 1727204059.99055: checking to see if all hosts have failed and the running result is not ok 9396 1727204059.99056: done checking to see if all hosts have failed 9396 1727204059.99057: getting the remaining hosts for this loop 9396 1727204059.99058: done getting the remaining hosts for this loop 9396 1727204059.99062: getting the next task for host managed-node1 9396 1727204059.99069: done getting next task for host 
managed-node1 9396 1727204059.99074: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 9396 1727204059.99078: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204059.99099: getting variables 9396 1727204059.99101: in VariableManager get_vars() 9396 1727204059.99146: Calling all_inventory to load vars for managed-node1 9396 1727204059.99149: Calling groups_inventory to load vars for managed-node1 9396 1727204059.99152: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204059.99162: Calling all_plugins_play to load vars for managed-node1 9396 1727204059.99165: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204059.99169: Calling groups_plugins_play to load vars for managed-node1 9396 1727204060.01016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204060.04004: done with get_vars() 9396 1727204060.04051: done getting variables 9396 1727204060.04126: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.107) 0:00:36.012 *****
9396 1727204060.04168: entering _queue_task() for managed-node1/service
9396 1727204060.04543: worker is 1 (out of 1 available)
9396 1727204060.04559: exiting _queue_task() for managed-node1/service
9396 1727204060.04574: done queuing things up, now waiting for results queue to drain
9396 1727204060.04576: waiting for pending results...
9396 1727204060.05008: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service
9396 1727204060.05064: in run() - task 12b410aa-8751-36c5-1f9e-00000000008b
9396 1727204060.05086: variable 'ansible_search_path' from source: unknown
9396 1727204060.05098: variable 'ansible_search_path' from source: unknown
9396 1727204060.05148: calling self._execute()
9396 1727204060.05263: variable 'ansible_host' from source: host vars for 'managed-node1'
9396 1727204060.05278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1'
9396 1727204060.05295: variable 'omit' from source: magic vars
9396 1727204060.05766: variable 'ansible_distribution_major_version' from source: facts
9396 1727204060.05799: Evaluated conditional (ansible_distribution_major_version != '6'): True
9396 1727204060.05974: variable 'network_provider' from source: set_fact
9396 1727204060.06095: Evaluated conditional (network_provider == "initscripts"): False
9396 1727204060.06099: when evaluation is False, skipping this task
9396 1727204060.06103: _execute() done
9396 1727204060.06105: dumping result to json
9396 1727204060.06108: done dumping result, returning 9396
1727204060.06111: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-36c5-1f9e-00000000008b] 9396 1727204060.06113: sending task result for task 12b410aa-8751-36c5-1f9e-00000000008b 9396 1727204060.06197: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000008b 9396 1727204060.06200: WORKER PROCESS EXITING skipping: [managed-node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 9396 1727204060.06256: no more pending results, returning what we have 9396 1727204060.06262: results queue empty 9396 1727204060.06263: checking for any_errors_fatal 9396 1727204060.06275: done checking for any_errors_fatal 9396 1727204060.06276: checking for max_fail_percentage 9396 1727204060.06279: done checking for max_fail_percentage 9396 1727204060.06280: checking to see if all hosts have failed and the running result is not ok 9396 1727204060.06282: done checking to see if all hosts have failed 9396 1727204060.06283: getting the remaining hosts for this loop 9396 1727204060.06284: done getting the remaining hosts for this loop 9396 1727204060.06292: getting the next task for host managed-node1 9396 1727204060.06302: done getting next task for host managed-node1 9396 1727204060.06307: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 9396 1727204060.06313: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
9396 1727204060.06341: getting variables
9396 1727204060.06343: in VariableManager get_vars()
9396 1727204060.06597: Calling all_inventory to load vars for managed-node1
9396 1727204060.06600: Calling groups_inventory to load vars for managed-node1
9396 1727204060.06604: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204060.06616: Calling all_plugins_play to load vars for managed-node1
9396 1727204060.06620: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204060.06624: Calling groups_plugins_play to load vars for managed-node1
9396 1727204060.08843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204060.11754: done with get_vars()
9396 1727204060.11807: done getting variables
9396 1727204060.11885: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.077) 0:00:36.090 *****
9396 1727204060.11934: entering _queue_task() for managed-node1/copy
9396 1727204060.12522: worker is 1 (out of 1 available)
9396 1727204060.12534:
exiting _queue_task() for managed-node1/copy 9396 1727204060.12545: done queuing things up, now waiting for results queue to drain 9396 1727204060.12547: waiting for pending results... 9396 1727204060.12791: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 9396 1727204060.12993: in run() - task 12b410aa-8751-36c5-1f9e-00000000008c 9396 1727204060.13000: variable 'ansible_search_path' from source: unknown 9396 1727204060.13003: variable 'ansible_search_path' from source: unknown 9396 1727204060.13006: calling self._execute() 9396 1727204060.13094: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204060.13195: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204060.13199: variable 'omit' from source: magic vars 9396 1727204060.13577: variable 'ansible_distribution_major_version' from source: facts 9396 1727204060.13599: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204060.13764: variable 'network_provider' from source: set_fact 9396 1727204060.13776: Evaluated conditional (network_provider == "initscripts"): False 9396 1727204060.13784: when evaluation is False, skipping this task 9396 1727204060.13795: _execute() done 9396 1727204060.13804: dumping result to json 9396 1727204060.13813: done dumping result, returning 9396 1727204060.13825: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-36c5-1f9e-00000000008c] 9396 1727204060.13835: sending task result for task 12b410aa-8751-36c5-1f9e-00000000008c skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 9396 1727204060.14147: no more pending results, returning what we have 9396 1727204060.14152: results queue empty 9396 
1727204060.14154: checking for any_errors_fatal 9396 1727204060.14161: done checking for any_errors_fatal 9396 1727204060.14163: checking for max_fail_percentage 9396 1727204060.14165: done checking for max_fail_percentage 9396 1727204060.14167: checking to see if all hosts have failed and the running result is not ok 9396 1727204060.14168: done checking to see if all hosts have failed 9396 1727204060.14169: getting the remaining hosts for this loop 9396 1727204060.14170: done getting the remaining hosts for this loop 9396 1727204060.14175: getting the next task for host managed-node1 9396 1727204060.14183: done getting next task for host managed-node1 9396 1727204060.14188: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 9396 1727204060.14195: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204060.14219: getting variables 9396 1727204060.14221: in VariableManager get_vars() 9396 1727204060.14267: Calling all_inventory to load vars for managed-node1 9396 1727204060.14271: Calling groups_inventory to load vars for managed-node1 9396 1727204060.14274: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204060.14458: Calling all_plugins_play to load vars for managed-node1 9396 1727204060.14464: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204060.14470: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000008c 9396 1727204060.14473: WORKER PROCESS EXITING 9396 1727204060.14477: Calling groups_plugins_play to load vars for managed-node1 9396 1727204060.16559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204060.19628: done with get_vars() 9396 1727204060.19660: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:54:20 -0400 (0:00:00.078) 0:00:36.168 ***** 9396 1727204060.19764: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 9396 1727204060.20115: worker is 1 (out of 1 available) 9396 1727204060.20130: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_connections 9396 1727204060.20144: done queuing things up, now waiting for results queue to drain 9396 1727204060.20146: waiting for pending results... 
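The skipped tasks recorded above ("Enable and start wpa_supplicant", "Enable network service", "Ensure initscripts network file dependency is present") all short-circuit in `_execute()` because a `when:` conditional evaluated False, as the `false_condition` fields show (`__network_wpa_supplicant_required` and `network_provider == "initscripts"`). A hedged sketch of that gating pattern follows; the variable names are taken from the log, but the task bodies are placeholders invented for illustration, not the role's real implementation.

```yaml
# Illustrative only: shows the conditional gating visible in the skip records.
# Variable names match the log; task bodies are placeholders.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required  # evaluated False in this run

- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"  # False here: this run resolved "nm"
```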
9396 1727204060.20448: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 9396 1727204060.20636: in run() - task 12b410aa-8751-36c5-1f9e-00000000008d 9396 1727204060.20658: variable 'ansible_search_path' from source: unknown 9396 1727204060.20668: variable 'ansible_search_path' from source: unknown 9396 1727204060.20718: calling self._execute() 9396 1727204060.20837: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204060.20852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204060.20870: variable 'omit' from source: magic vars 9396 1727204060.21315: variable 'ansible_distribution_major_version' from source: facts 9396 1727204060.21335: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204060.21348: variable 'omit' from source: magic vars 9396 1727204060.21444: variable 'omit' from source: magic vars 9396 1727204060.21655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 9396 1727204060.24256: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 9396 1727204060.24442: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 9396 1727204060.24446: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 9396 1727204060.24449: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 9396 1727204060.24486: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 9396 1727204060.24601: variable 'network_provider' from source: set_fact 9396 1727204060.24774: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 9396 1727204060.24833: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 9396 1727204060.24872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 9396 1727204060.24932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 9396 1727204060.24953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 9396 1727204060.25050: variable 'omit' from source: magic vars 9396 1727204060.25199: variable 'omit' from source: magic vars 9396 1727204060.25340: variable 'network_connections' from source: task vars 9396 1727204060.25359: variable 'port2_profile' from source: play vars 9396 1727204060.25441: variable 'port2_profile' from source: play vars 9396 1727204060.25456: variable 'port1_profile' from source: play vars 9396 1727204060.25643: variable 'port1_profile' from source: play vars 9396 1727204060.25647: variable 'controller_profile' from source: play vars 9396 1727204060.25649: variable 'controller_profile' from source: play vars 9396 1727204060.25831: variable 'omit' from source: magic vars 9396 1727204060.25846: variable '__lsr_ansible_managed' from source: task vars 9396 1727204060.25927: variable '__lsr_ansible_managed' from source: task vars 9396 1727204060.26157: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 9396 1727204060.26450: Loaded 
config def from plugin (lookup/template) 9396 1727204060.26461: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 9396 1727204060.26497: File lookup term: get_ansible_managed.j2 9396 1727204060.26505: variable 'ansible_search_path' from source: unknown 9396 1727204060.26521: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 9396 1727204060.26542: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 9396 1727204060.26566: variable 'ansible_search_path' from source: unknown 9396 1727204060.40858: variable 'ansible_managed' from source: unknown 9396 1727204060.41103: variable 'omit' from source: magic vars 9396 1727204060.41141: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204060.41176: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204060.41207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204060.41233: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204060.41250: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204060.41307: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204060.41310: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204060.41313: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204060.41435: Set connection var ansible_timeout to 10 9396 1727204060.41493: Set connection var ansible_shell_executable to /bin/sh 9396 1727204060.41496: Set connection var ansible_pipelining to False 9396 1727204060.41499: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204060.41501: Set connection var ansible_connection to ssh 9396 1727204060.41503: Set connection var ansible_shell_type to sh 9396 1727204060.41525: variable 'ansible_shell_executable' from source: unknown 9396 1727204060.41533: variable 'ansible_connection' from source: unknown 9396 1727204060.41541: variable 'ansible_module_compression' from source: unknown 9396 1727204060.41549: variable 'ansible_shell_type' from source: unknown 9396 1727204060.41556: variable 'ansible_shell_executable' from source: unknown 9396 1727204060.41562: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204060.41571: variable 'ansible_pipelining' from source: unknown 9396 1727204060.41578: variable 'ansible_timeout' from source: unknown 9396 1727204060.41632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204060.41764: 
Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204060.41783: variable 'omit' from source: magic vars 9396 1727204060.41798: starting attempt loop 9396 1727204060.41806: running the handler 9396 1727204060.41849: _low_level_execute_command(): starting 9396 1727204060.41853: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204060.42573: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204060.42587: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204060.42715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204060.42735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204060.43073: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204060.43130: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 
1727204060.44946: stdout chunk (state=3): >>>/root <<< 9396 1727204060.45238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204060.45242: stdout chunk (state=3): >>><<< 9396 1727204060.45244: stderr chunk (state=3): >>><<< 9396 1727204060.45247: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204060.45249: _low_level_execute_command(): starting 9396 1727204060.45580: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171 `" && echo ansible-tmp-1727204060.4521356-11755-210051153424171="` echo /root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171 `" ) && sleep 0' 9396 1727204060.46905: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204060.47054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204060.47061: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204060.47121: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204060.47322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204060.49419: stdout chunk (state=3): >>>ansible-tmp-1727204060.4521356-11755-210051153424171=/root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171 <<< 9396 1727204060.49529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204060.49704: stderr chunk (state=3): >>><<< 9396 1727204060.49708: stdout chunk (state=3): >>><<< 9396 1727204060.49734: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204060.4521356-11755-210051153424171=/root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204060.49788: variable 'ansible_module_compression' from source: unknown 9396 1727204060.49842: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 9396 1727204060.49875: variable 'ansible_facts' from source: unknown 9396 1727204060.50196: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/AnsiballZ_network_connections.py 9396 1727204060.50562: Sending initial data 9396 1727204060.50566: Sent initial data (167 bytes) 9396 1727204060.51834: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204060.51845: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204060.51858: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204060.51875: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204060.51888: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204060.51898: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204060.51909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204060.51928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204060.51939: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204060.51944: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204060.52318: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204060.52357: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204060.54211: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204060.54246: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204060.54296: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpinl3amgd /root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/AnsiballZ_network_connections.py <<< 9396 1727204060.54299: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/AnsiballZ_network_connections.py" <<< 9396 1727204060.54351: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpinl3amgd" to remote "/root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/AnsiballZ_network_connections.py" <<< 9396 1727204060.57773: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204060.57777: stderr chunk (state=3): >>><<< 9396 1727204060.57779: stdout chunk (state=3): >>><<< 9396 1727204060.57781: done transferring module to remote 9396 1727204060.57784: _low_level_execute_command(): starting 9396 1727204060.57786: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/ /root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/AnsiballZ_network_connections.py && sleep 0' 9396 1727204060.58900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204060.58961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204060.59008: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204060.59216: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204060.59405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204060.61294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204060.61333: stderr chunk (state=3): >>><<< 9396 1727204060.61342: stdout chunk (state=3): >>><<< 9396 1727204060.61411: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204060.61422: _low_level_execute_command(): starting 9396 1727204060.61432: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/AnsiballZ_network_connections.py && sleep 0' 9396 1727204060.62632: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204060.62636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204060.62639: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204060.62641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 
1727204060.62782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
9396 1727204060.62904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
9396 1727204060.62999: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
9396 1727204061.23753: stdout chunk (state=3): >>>Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/786a4cc1-8289-45a7-9bfe-8d14a2de36f1: error=unknown <<<
9396 1727204061.25619: stdout chunk (state=3): >>>Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/720e7a7e-de4c-4117-856c-30dd1c763bd3: error=unknown <<<
9396 1727204061.27373: stdout chunk (state=3): >>>Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/7ec0abe6-17a3-4aaf-8c88-591c4827c290: error=unknown <<<
9396 1727204061.27724: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<<
9396 1727204061.29671: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed.
<<< 9396 1727204061.29741: stderr chunk (state=3): >>><<<
9396 1727204061.29744: stdout chunk (state=3): >>><<<
9396 1727204061.29764: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/786a4cc1-8289-45a7-9bfe-8d14a2de36f1: error=unknown
Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/720e7a7e-de4c-4117-856c-30dd1c763bd3: error=unknown
Traceback (most recent call last):
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back
  File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_a1vmku04/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/7ec0abe6-17a3-4aaf-8c88-591c4827c290: error=unknown
{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204061.29820: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204061.29830: _low_level_execute_command(): starting 9396 
1727204061.29836: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204060.4521356-11755-210051153424171/ > /dev/null 2>&1 && sleep 0' 9396 1727204061.30352: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204061.30356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204061.30359: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204061.30361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204061.30364: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204061.30416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204061.30420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204061.30472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204061.32456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204061.32505: stderr chunk (state=3): >>><<< 9396 1727204061.32508: stdout chunk (state=3): >>><<< 9396 1727204061.32525: _low_level_execute_command() done: rc=0, stdout=, 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
9396 1727204061.32532: handler run complete
9396 1727204061.32562: attempt loop complete, returning result
9396 1727204061.32565: _execute() done
9396 1727204061.32571: dumping result to json
9396 1727204061.32579: done dumping result, returning
9396 1727204061.32591: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-36c5-1f9e-00000000008d]
9396 1727204061.32596: sending task result for task 12b410aa-8751-36c5-1f9e-00000000008d
9396 1727204061.32716: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000008d
9396 1727204061.32719: WORKER PROCESS EXITING
changed: [managed-node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "name": "bond0.1",
                    "persistent_state": "absent",
                    "state": "down"
                },
                {
                    "name": "bond0.0",
                    "persistent_state": "absent",
                    "state": "down"
                },
                {
                    "name": "bond0",
                    "persistent_state": "absent",
                    "state": "down"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

9396 1727204061.32861: no more pending results, returning what we have
9396 1727204061.32865: results queue empty
9396 1727204061.32866: checking for any_errors_fatal
9396 1727204061.32873: done checking for any_errors_fatal
9396 1727204061.32873: checking for max_fail_percentage
9396 1727204061.32875: done checking for max_fail_percentage
9396 1727204061.32876: checking to see if all hosts have failed and the running result is not ok
9396 1727204061.32877: done checking to see if all hosts have failed
9396 1727204061.32878: getting the remaining hosts for this loop
9396 1727204061.32880: done getting the remaining hosts for this loop
9396 1727204061.32884: getting the next task for host managed-node1
9396 1727204061.32898: done getting next task for host managed-node1
9396 1727204061.32902: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
9396 1727204061.32906: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue?
False, did start at task? False), did rescue? False, did start at task? False 9396 1727204061.32919: getting variables 9396 1727204061.32921: in VariableManager get_vars() 9396 1727204061.32961: Calling all_inventory to load vars for managed-node1 9396 1727204061.32964: Calling groups_inventory to load vars for managed-node1 9396 1727204061.32967: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204061.32977: Calling all_plugins_play to load vars for managed-node1 9396 1727204061.32980: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204061.32988: Calling groups_plugins_play to load vars for managed-node1 9396 1727204061.34217: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204061.37133: done with get_vars() 9396 1727204061.37170: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:54:21 -0400 (0:00:01.175) 0:00:37.343 ***** 9396 1727204061.37274: entering _queue_task() for managed-node1/fedora.linux_system_roles.network_state 9396 1727204061.37604: worker is 1 (out of 1 available) 9396 1727204061.37617: exiting _queue_task() for managed-node1/fedora.linux_system_roles.network_state 9396 1727204061.37630: done queuing things up, now waiting for results queue to drain 9396 1727204061.37632: waiting for pending results... 
9396 1727204061.38010: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state 9396 1727204061.38118: in run() - task 12b410aa-8751-36c5-1f9e-00000000008e 9396 1727204061.38139: variable 'ansible_search_path' from source: unknown 9396 1727204061.38147: variable 'ansible_search_path' from source: unknown 9396 1727204061.38192: calling self._execute() 9396 1727204061.38303: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.38322: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.38340: variable 'omit' from source: magic vars 9396 1727204061.38787: variable 'ansible_distribution_major_version' from source: facts 9396 1727204061.38866: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204061.38975: variable 'network_state' from source: role '' defaults 9396 1727204061.38994: Evaluated conditional (network_state != {}): False 9396 1727204061.39003: when evaluation is False, skipping this task 9396 1727204061.39010: _execute() done 9396 1727204061.39019: dumping result to json 9396 1727204061.39027: done dumping result, returning 9396 1727204061.39038: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-36c5-1f9e-00000000008e] 9396 1727204061.39049: sending task result for task 12b410aa-8751-36c5-1f9e-00000000008e
skipping: [managed-node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
9396 1727204061.39245: no more pending results, returning what we have 9396 1727204061.39251: results queue empty 9396 1727204061.39252: checking for any_errors_fatal 9396 1727204061.39265: done checking for any_errors_fatal 9396 1727204061.39266: checking for max_fail_percentage 9396 1727204061.39268: done checking for max_fail_percentage 9396 1727204061.39269: checking to see if all hosts
have failed and the running result is not ok 9396 1727204061.39270: done checking to see if all hosts have failed 9396 1727204061.39271: getting the remaining hosts for this loop 9396 1727204061.39273: done getting the remaining hosts for this loop 9396 1727204061.39278: getting the next task for host managed-node1 9396 1727204061.39286: done getting next task for host managed-node1 9396 1727204061.39292: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 9396 1727204061.39297: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204061.39321: getting variables 9396 1727204061.39323: in VariableManager get_vars() 9396 1727204061.39371: Calling all_inventory to load vars for managed-node1 9396 1727204061.39374: Calling groups_inventory to load vars for managed-node1 9396 1727204061.39377: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204061.39652: Calling all_plugins_play to load vars for managed-node1 9396 1727204061.39658: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204061.39665: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000008e 9396 1727204061.39669: WORKER PROCESS EXITING 9396 1727204061.39674: Calling groups_plugins_play to load vars for managed-node1 9396 1727204061.41916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204061.44833: done with get_vars() 9396 1727204061.44867: done getting variables 9396 1727204061.44936: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.076) 0:00:37.420 ***** 9396 1727204061.44976: entering _queue_task() for managed-node1/debug 9396 1727204061.45295: worker is 1 (out of 1 available) 9396 1727204061.45309: exiting _queue_task() for managed-node1/debug 9396 1727204061.45321: done queuing things up, now waiting for results queue to drain 9396 1727204061.45323: waiting for pending results... 
9396 1727204061.45614: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 9396 1727204061.45801: in run() - task 12b410aa-8751-36c5-1f9e-00000000008f 9396 1727204061.45823: variable 'ansible_search_path' from source: unknown 9396 1727204061.45832: variable 'ansible_search_path' from source: unknown 9396 1727204061.45879: calling self._execute() 9396 1727204061.45993: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.46009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.46028: variable 'omit' from source: magic vars 9396 1727204061.46457: variable 'ansible_distribution_major_version' from source: facts 9396 1727204061.46478: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204061.46596: variable 'omit' from source: magic vars 9396 1727204061.46601: variable 'omit' from source: magic vars 9396 1727204061.46647: variable 'omit' from source: magic vars 9396 1727204061.46701: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204061.46755: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204061.46785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204061.46815: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204061.46842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204061.46885: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204061.46921: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.46935: variable 'ansible_ssh_extra_args' from source: host vars 
for 'managed-node1' 9396 1727204061.47105: Set connection var ansible_timeout to 10 9396 1727204061.47153: Set connection var ansible_shell_executable to /bin/sh 9396 1727204061.47156: Set connection var ansible_pipelining to False 9396 1727204061.47159: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204061.47161: Set connection var ansible_connection to ssh 9396 1727204061.47163: Set connection var ansible_shell_type to sh 9396 1727204061.47211: variable 'ansible_shell_executable' from source: unknown 9396 1727204061.47228: variable 'ansible_connection' from source: unknown 9396 1727204061.47261: variable 'ansible_module_compression' from source: unknown 9396 1727204061.47264: variable 'ansible_shell_type' from source: unknown 9396 1727204061.47267: variable 'ansible_shell_executable' from source: unknown 9396 1727204061.47269: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.47271: variable 'ansible_pipelining' from source: unknown 9396 1727204061.47273: variable 'ansible_timeout' from source: unknown 9396 1727204061.47283: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.47481: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204061.47485: variable 'omit' from source: magic vars 9396 1727204061.47499: starting attempt loop 9396 1727204061.47507: running the handler 9396 1727204061.47696: variable '__network_connections_result' from source: set_fact 9396 1727204061.47750: handler run complete 9396 1727204061.47788: attempt loop complete, returning result 9396 1727204061.47809: _execute() done 9396 1727204061.47820: dumping result to json 9396 1727204061.47918: done dumping result, returning 9396 
1727204061.47922: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-36c5-1f9e-00000000008f] 9396 1727204061.47925: sending task result for task 12b410aa-8751-36c5-1f9e-00000000008f 9396 1727204061.48001: done sending task result for task 12b410aa-8751-36c5-1f9e-00000000008f 9396 1727204061.48004: WORKER PROCESS EXITING
ok: [managed-node1] => {
    "__network_connections_result.stderr_lines": [
        ""
    ]
}
9396 1727204061.48109: no more pending results, returning what we have 9396 1727204061.48113: results queue empty 9396 1727204061.48115: checking for any_errors_fatal 9396 1727204061.48125: done checking for any_errors_fatal 9396 1727204061.48126: checking for max_fail_percentage 9396 1727204061.48128: done checking for max_fail_percentage 9396 1727204061.48129: checking to see if all hosts have failed and the running result is not ok 9396 1727204061.48131: done checking to see if all hosts have failed 9396 1727204061.48132: getting the remaining hosts for this loop 9396 1727204061.48133: done getting the remaining hosts for this loop 9396 1727204061.48138: getting the next task for host managed-node1 9396 1727204061.48146: done getting next task for host managed-node1 9396 1727204061.48151: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 9396 1727204061.48155: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204061.48170: getting variables 9396 1727204061.48172: in VariableManager get_vars() 9396 1727204061.48371: Calling all_inventory to load vars for managed-node1 9396 1727204061.48375: Calling groups_inventory to load vars for managed-node1 9396 1727204061.48378: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204061.48392: Calling all_plugins_play to load vars for managed-node1 9396 1727204061.48397: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204061.48401: Calling groups_plugins_play to load vars for managed-node1 9396 1727204061.50772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204061.53798: done with get_vars() 9396 1727204061.53832: done getting variables 9396 1727204061.53907: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.089) 0:00:37.510 ***** 9396 1727204061.53952: entering _queue_task() for managed-node1/debug 9396 1727204061.54313: worker is 1 (out of 1 available) 9396 1727204061.54327: exiting 
_queue_task() for managed-node1/debug 9396 1727204061.54341: done queuing things up, now waiting for results queue to drain 9396 1727204061.54342: waiting for pending results... 9396 1727204061.54720: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 9396 1727204061.54996: in run() - task 12b410aa-8751-36c5-1f9e-000000000090 9396 1727204061.55001: variable 'ansible_search_path' from source: unknown 9396 1727204061.55004: variable 'ansible_search_path' from source: unknown 9396 1727204061.55007: calling self._execute() 9396 1727204061.55035: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.55052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.55074: variable 'omit' from source: magic vars 9396 1727204061.55547: variable 'ansible_distribution_major_version' from source: facts 9396 1727204061.55574: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204061.55591: variable 'omit' from source: magic vars 9396 1727204061.55693: variable 'omit' from source: magic vars 9396 1727204061.55748: variable 'omit' from source: magic vars 9396 1727204061.55811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204061.55860: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204061.55898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204061.55927: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204061.55996: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204061.56000: variable 'inventory_hostname' from source: host vars for 
'managed-node1' 9396 1727204061.56003: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.56012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.56147: Set connection var ansible_timeout to 10 9396 1727204061.56162: Set connection var ansible_shell_executable to /bin/sh 9396 1727204061.56179: Set connection var ansible_pipelining to False 9396 1727204061.56193: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204061.56205: Set connection var ansible_connection to ssh 9396 1727204061.56216: Set connection var ansible_shell_type to sh 9396 1727204061.56250: variable 'ansible_shell_executable' from source: unknown 9396 1727204061.56324: variable 'ansible_connection' from source: unknown 9396 1727204061.56328: variable 'ansible_module_compression' from source: unknown 9396 1727204061.56330: variable 'ansible_shell_type' from source: unknown 9396 1727204061.56332: variable 'ansible_shell_executable' from source: unknown 9396 1727204061.56334: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.56336: variable 'ansible_pipelining' from source: unknown 9396 1727204061.56339: variable 'ansible_timeout' from source: unknown 9396 1727204061.56341: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.56481: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204061.56503: variable 'omit' from source: magic vars 9396 1727204061.56515: starting attempt loop 9396 1727204061.56522: running the handler 9396 1727204061.56586: variable '__network_connections_result' from source: set_fact 9396 1727204061.56695: variable '__network_connections_result' from source: 
set_fact 9396 1727204061.56875: handler run complete 9396 1727204061.56920: attempt loop complete, returning result 9396 1727204061.56976: _execute() done 9396 1727204061.56980: dumping result to json 9396 1727204061.56982: done dumping result, returning 9396 1727204061.56985: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-36c5-1f9e-000000000090] 9396 1727204061.56987: sending task result for task 12b410aa-8751-36c5-1f9e-000000000090
ok: [managed-node1] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "name": "bond0.1",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0.0",
                        "persistent_state": "absent",
                        "state": "down"
                    },
                    {
                        "name": "bond0",
                        "persistent_state": "absent",
                        "state": "down"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "\n",
        "stderr_lines": [
            ""
        ]
    }
}
9396 1727204061.57201: no more pending results, returning what we have 9396 1727204061.57206: results queue empty 9396 1727204061.57207: checking for any_errors_fatal 9396 1727204061.57213: done checking for any_errors_fatal 9396 1727204061.57215: checking for max_fail_percentage 9396 1727204061.57217: done checking for max_fail_percentage 9396 1727204061.57218: checking to see if all hosts have failed and the running result is not ok 9396 1727204061.57219: done checking to see if all hosts have failed 9396 1727204061.57220: getting the remaining hosts for this loop 9396 1727204061.57222: done getting the remaining hosts for this loop 9396 1727204061.57227: getting the next task for host managed-node1 9396 1727204061.57235: done getting next task for host managed-node1 9396 1727204061.57240: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
9396 1727204061.57245: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204061.57260: getting variables 9396 1727204061.57262: in VariableManager get_vars() 9396 1727204061.57609: Calling all_inventory to load vars for managed-node1 9396 1727204061.57612: Calling groups_inventory to load vars for managed-node1 9396 1727204061.57616: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204061.57627: Calling all_plugins_play to load vars for managed-node1 9396 1727204061.57630: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204061.57634: Calling groups_plugins_play to load vars for managed-node1 9396 1727204061.58306: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000090 9396 1727204061.58309: WORKER PROCESS EXITING 9396 1727204061.59749: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204061.62960: done with get_vars() 9396 1727204061.63010: done getting variables 9396 1727204061.63102: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.093) 0:00:37.603 ***** 9396 1727204061.63254: entering _queue_task() for managed-node1/debug 9396 1727204061.63816: worker is 1 (out of 1 available) 9396 1727204061.63828: exiting _queue_task() for managed-node1/debug 9396 1727204061.63839: done queuing things up, now waiting for results queue to drain 9396 1727204061.63841: waiting for pending results... 9396 1727204061.63963: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 9396 1727204061.64160: in run() - task 12b410aa-8751-36c5-1f9e-000000000091 9396 1727204061.64176: variable 'ansible_search_path' from source: unknown 9396 1727204061.64180: variable 'ansible_search_path' from source: unknown 9396 1727204061.64223: calling self._execute() 9396 1727204061.64330: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.64339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.64357: variable 'omit' from source: magic vars 9396 1727204061.64797: variable 'ansible_distribution_major_version' from source: facts 9396 1727204061.64813: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204061.64974: variable 'network_state' from source: role '' defaults 9396 1727204061.64986: Evaluated conditional (network_state != {}): False 9396 1727204061.64991: when evaluation is False, skipping this task 9396 1727204061.64995: _execute() done 9396 1727204061.65000: dumping result to json 9396 1727204061.65012: done dumping result, 
returning 9396 1727204061.65022: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-36c5-1f9e-000000000091] 9396 1727204061.65029: sending task result for task 12b410aa-8751-36c5-1f9e-000000000091 9396 1727204061.65132: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000091 9396 1727204061.65135: WORKER PROCESS EXITING
skipping: [managed-node1] => {
    "false_condition": "network_state != {}"
}
9396 1727204061.65215: no more pending results, returning what we have 9396 1727204061.65219: results queue empty 9396 1727204061.65221: checking for any_errors_fatal 9396 1727204061.65232: done checking for any_errors_fatal 9396 1727204061.65233: checking for max_fail_percentage 9396 1727204061.65236: done checking for max_fail_percentage 9396 1727204061.65237: checking to see if all hosts have failed and the running result is not ok 9396 1727204061.65238: done checking to see if all hosts have failed 9396 1727204061.65238: getting the remaining hosts for this loop 9396 1727204061.65240: done getting the remaining hosts for this loop 9396 1727204061.65244: getting the next task for host managed-node1 9396 1727204061.65251: done getting next task for host managed-node1 9396 1727204061.65256: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 9396 1727204061.65260: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204061.65277: getting variables 9396 1727204061.65279: in VariableManager get_vars() 9396 1727204061.65318: Calling all_inventory to load vars for managed-node1 9396 1727204061.65322: Calling groups_inventory to load vars for managed-node1 9396 1727204061.65324: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204061.65334: Calling all_plugins_play to load vars for managed-node1 9396 1727204061.65338: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204061.65341: Calling groups_plugins_play to load vars for managed-node1 9396 1727204061.67562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204061.70438: done with get_vars() 9396 1727204061.70470: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:54:21 -0400 (0:00:00.073) 0:00:37.676 ***** 9396 1727204061.70583: entering _queue_task() for managed-node1/ping 9396 1727204061.70892: worker is 1 (out of 1 available) 9396 1727204061.70907: exiting _queue_task() for managed-node1/ping 9396 1727204061.70921: done queuing things up, now waiting for results queue to drain 9396 1727204061.70922: waiting for pending results... 
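Both `network_state` tasks in this run ("Configure networking state" and "Show debug messages for the network_state") were skipped because the role-default `network_state` is an empty dict, so `Evaluated conditional (network_state != {}): False`. A hedged sketch of the guard pattern behind those skips (the task body is an assumption; the `when:` expression is quoted from the logged `false_condition`):

```yaml
# Sketch of the conditional seen in the log. The debug body here is an
# assumption; the when: expression comes verbatim from the skip result.
- name: Show debug messages for the network_state
  debug:
    var: network_state
  when: network_state != {}   # role default is {}, so this task is skipped
```

Because the run only supplied `network_connections`, the connection-profile path executed while the Nmstate-style `network_state` path was bypassed, which is exactly the split the skip results document.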
9396 1727204061.71312: running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 9396 1727204061.71395: in run() - task 12b410aa-8751-36c5-1f9e-000000000092 9396 1727204061.71422: variable 'ansible_search_path' from source: unknown 9396 1727204061.71431: variable 'ansible_search_path' from source: unknown 9396 1727204061.71475: calling self._execute() 9396 1727204061.71591: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.71606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.71629: variable 'omit' from source: magic vars 9396 1727204061.72076: variable 'ansible_distribution_major_version' from source: facts 9396 1727204061.72098: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204061.72110: variable 'omit' from source: magic vars 9396 1727204061.72220: variable 'omit' from source: magic vars 9396 1727204061.72268: variable 'omit' from source: magic vars 9396 1727204061.72386: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204061.72392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204061.72398: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204061.72425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204061.72444: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204061.72483: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204061.72498: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.72507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 
1727204061.72645: Set connection var ansible_timeout to 10 9396 1727204061.72659: Set connection var ansible_shell_executable to /bin/sh 9396 1727204061.72674: Set connection var ansible_pipelining to False 9396 1727204061.72687: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204061.72702: Set connection var ansible_connection to ssh 9396 1727204061.72824: Set connection var ansible_shell_type to sh 9396 1727204061.72827: variable 'ansible_shell_executable' from source: unknown 9396 1727204061.72830: variable 'ansible_connection' from source: unknown 9396 1727204061.72833: variable 'ansible_module_compression' from source: unknown 9396 1727204061.72835: variable 'ansible_shell_type' from source: unknown 9396 1727204061.72837: variable 'ansible_shell_executable' from source: unknown 9396 1727204061.72839: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204061.72841: variable 'ansible_pipelining' from source: unknown 9396 1727204061.72843: variable 'ansible_timeout' from source: unknown 9396 1727204061.72845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204061.73044: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 9396 1727204061.73063: variable 'omit' from source: magic vars 9396 1727204061.73075: starting attempt loop 9396 1727204061.73082: running the handler 9396 1727204061.73106: _low_level_execute_command(): starting 9396 1727204061.73120: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204061.73884: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204061.73924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204061.73944: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204061.74038: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204061.74062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204061.74077: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204061.74105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204061.74183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204061.75977: stdout chunk (state=3): >>>/root <<< 9396 1727204061.76197: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204061.76202: stdout chunk (state=3): >>><<< 9396 1727204061.76205: stderr chunk (state=3): >>><<< 9396 1727204061.76230: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204061.76333: _low_level_execute_command(): starting 9396 1727204061.76338: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019 `" && echo ansible-tmp-1727204061.7623634-11799-204330491577019="` echo /root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019 `" ) && sleep 0' 9396 1727204061.76874: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204061.76891: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204061.76908: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204061.76933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204061.77007: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204061.77073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204061.77093: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204061.77167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204061.79174: stdout chunk (state=3): >>>ansible-tmp-1727204061.7623634-11799-204330491577019=/root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019 <<< 9396 1727204061.79405: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204061.79408: stdout chunk (state=3): >>><<< 9396 1727204061.79411: stderr chunk (state=3): >>><<< 9396 1727204061.79434: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204061.7623634-11799-204330491577019=/root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204061.79488: variable 'ansible_module_compression' from source: unknown 9396 1727204061.79549: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 9396 1727204061.79592: variable 'ansible_facts' from source: unknown 9396 1727204061.79693: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/AnsiballZ_ping.py 9396 1727204061.79925: Sending initial data 9396 1727204061.79928: Sent initial data (152 bytes) 9396 1727204061.80476: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204061.80486: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204061.80605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 9396 1727204061.80623: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204061.80636: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204061.80704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204061.82412: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 9396 1727204061.82428: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 9396 1727204061.82447: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 9396 1727204061.82475: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204061.82524: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204061.82580: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpmbk51k9i /root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/AnsiballZ_ping.py <<< 9396 1727204061.82583: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/AnsiballZ_ping.py" <<< 9396 1727204061.82629: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpmbk51k9i" to remote "/root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/AnsiballZ_ping.py" <<< 9396 1727204061.83655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204061.83735: stderr chunk (state=3): >>><<< 9396 1727204061.83862: stdout chunk (state=3): >>><<< 9396 1727204061.83866: done transferring module to remote 9396 1727204061.83868: _low_level_execute_command(): starting 9396 1727204061.83871: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/ /root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/AnsiballZ_ping.py && sleep 0' 9396 1727204061.84430: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204061.84444: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204061.84505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204061.84578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204061.84601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204061.84635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204061.84704: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204061.86717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204061.86723: stdout chunk (state=3): >>><<< 9396 1727204061.86725: stderr chunk (state=3): >>><<< 9396 1727204061.86744: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204061.86754: _low_level_execute_command(): starting 9396 1727204061.86762: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/AnsiballZ_ping.py && sleep 0' 9396 1727204061.87431: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204061.87447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204061.87507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204061.87580: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204061.87601: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204061.87635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204061.87720: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 9396 1727204062.05864: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 9396 1727204062.07442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204062.07503: stderr chunk (state=3): >>><<< 9396 1727204062.07510: stdout chunk (state=3): >>><<< 9396 1727204062.07524: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
9396 1727204062.07548: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204062.07562: _low_level_execute_command(): starting 9396 1727204062.07569: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204061.7623634-11799-204330491577019/ > /dev/null 2>&1 && sleep 0' 9396 1727204062.08046: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204062.08049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.08053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204062.08055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.08110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204062.08115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.08166: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.10211: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.10265: stderr chunk (state=3): >>><<< 9396 1727204062.10270: stdout chunk (state=3): >>><<< 9396 1727204062.10291: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204062.10301: handler run complete 9396 1727204062.10319: attempt loop complete, returning result 9396 
1727204062.10322: _execute() done 9396 1727204062.10326: dumping result to json 9396 1727204062.10331: done dumping result, returning 9396 1727204062.10342: done running TaskExecutor() for managed-node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-36c5-1f9e-000000000092] 9396 1727204062.10347: sending task result for task 12b410aa-8751-36c5-1f9e-000000000092 9396 1727204062.10448: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000092 9396 1727204062.10451: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "ping": "pong" } 9396 1727204062.10530: no more pending results, returning what we have 9396 1727204062.10535: results queue empty 9396 1727204062.10536: checking for any_errors_fatal 9396 1727204062.10546: done checking for any_errors_fatal 9396 1727204062.10547: checking for max_fail_percentage 9396 1727204062.10549: done checking for max_fail_percentage 9396 1727204062.10550: checking to see if all hosts have failed and the running result is not ok 9396 1727204062.10551: done checking to see if all hosts have failed 9396 1727204062.10552: getting the remaining hosts for this loop 9396 1727204062.10553: done getting the remaining hosts for this loop 9396 1727204062.10558: getting the next task for host managed-node1 9396 1727204062.10577: done getting next task for host managed-node1 9396 1727204062.10580: ^ task is: TASK: meta (role_complete) 9396 1727204062.10585: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204062.10600: getting variables 9396 1727204062.10602: in VariableManager get_vars() 9396 1727204062.10646: Calling all_inventory to load vars for managed-node1 9396 1727204062.10649: Calling groups_inventory to load vars for managed-node1 9396 1727204062.10652: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204062.10662: Calling all_plugins_play to load vars for managed-node1 9396 1727204062.10665: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204062.10671: Calling groups_plugins_play to load vars for managed-node1 9396 1727204062.11902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204062.13563: done with get_vars() 9396 1727204062.13586: done getting variables 9396 1727204062.13664: done queuing things up, now waiting for results queue to drain 9396 1727204062.13666: results queue empty 9396 1727204062.13666: checking for any_errors_fatal 9396 1727204062.13669: done checking for any_errors_fatal 9396 1727204062.13669: checking for max_fail_percentage 9396 1727204062.13670: done checking for max_fail_percentage 9396 1727204062.13671: checking to see if all hosts have failed and the running result is not ok 9396 1727204062.13671: done checking to see if all hosts have failed 9396 1727204062.13672: getting the remaining hosts for this loop 9396 1727204062.13673: done getting the remaining hosts for this loop 9396 1727204062.13675: getting the next task for host managed-node1 9396 1727204062.13678: done getting 
next task for host managed-node1 9396 1727204062.13680: ^ task is: TASK: Delete the device '{{ controller_device }}' 9396 1727204062.13681: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204062.13683: getting variables 9396 1727204062.13684: in VariableManager get_vars() 9396 1727204062.13698: Calling all_inventory to load vars for managed-node1 9396 1727204062.13700: Calling groups_inventory to load vars for managed-node1 9396 1727204062.13702: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204062.13706: Calling all_plugins_play to load vars for managed-node1 9396 1727204062.13710: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204062.13712: Calling groups_plugins_play to load vars for managed-node1 9396 1727204062.14778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204062.16329: done with get_vars() 9396 1727204062.16352: done getting variables 9396 1727204062.16390: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 9396 1727204062.16494: variable 'controller_device' from source: play vars TASK [Delete the device 'deprecated-bond'] 
************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.459) 0:00:38.136 ***** 9396 1727204062.16522: entering _queue_task() for managed-node1/command 9396 1727204062.16800: worker is 1 (out of 1 available) 9396 1727204062.16819: exiting _queue_task() for managed-node1/command 9396 1727204062.16832: done queuing things up, now waiting for results queue to drain 9396 1727204062.16834: waiting for pending results... 9396 1727204062.17028: running TaskExecutor() for managed-node1/TASK: Delete the device 'deprecated-bond' 9396 1727204062.17113: in run() - task 12b410aa-8751-36c5-1f9e-0000000000c2 9396 1727204062.17126: variable 'ansible_search_path' from source: unknown 9396 1727204062.17159: calling self._execute() 9396 1727204062.17246: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204062.17253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204062.17265: variable 'omit' from source: magic vars 9396 1727204062.17575: variable 'ansible_distribution_major_version' from source: facts 9396 1727204062.17587: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204062.17596: variable 'omit' from source: magic vars 9396 1727204062.17620: variable 'omit' from source: magic vars 9396 1727204062.17700: variable 'controller_device' from source: play vars 9396 1727204062.17718: variable 'omit' from source: magic vars 9396 1727204062.17756: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204062.17787: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204062.17811: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204062.17829: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204062.17842: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204062.17870: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204062.17873: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204062.17878: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204062.17965: Set connection var ansible_timeout to 10 9396 1727204062.17971: Set connection var ansible_shell_executable to /bin/sh 9396 1727204062.17980: Set connection var ansible_pipelining to False 9396 1727204062.17987: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204062.17995: Set connection var ansible_connection to ssh 9396 1727204062.17998: Set connection var ansible_shell_type to sh 9396 1727204062.18022: variable 'ansible_shell_executable' from source: unknown 9396 1727204062.18025: variable 'ansible_connection' from source: unknown 9396 1727204062.18028: variable 'ansible_module_compression' from source: unknown 9396 1727204062.18031: variable 'ansible_shell_type' from source: unknown 9396 1727204062.18037: variable 'ansible_shell_executable' from source: unknown 9396 1727204062.18040: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204062.18047: variable 'ansible_pipelining' from source: unknown 9396 1727204062.18051: variable 'ansible_timeout' from source: unknown 9396 1727204062.18053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204062.18175: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204062.18187: variable 'omit' from source: magic vars 9396 1727204062.18194: starting attempt loop 9396 1727204062.18197: running the handler 9396 1727204062.18214: _low_level_execute_command(): starting 9396 1727204062.18221: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204062.18776: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204062.18780: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.18783: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204062.18785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.18849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204062.18856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.18902: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.20646: stdout chunk 
(state=3): >>>/root <<< 9396 1727204062.20809: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.20861: stderr chunk (state=3): >>><<< 9396 1727204062.20865: stdout chunk (state=3): >>><<< 9396 1727204062.20890: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204062.20909: _low_level_execute_command(): starting 9396 1727204062.20914: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256 `" && echo ansible-tmp-1727204062.208883-11822-50023996363256="` echo /root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256 `" ) && sleep 0' 9396 1727204062.21396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204062.21399: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.21485: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204062.21530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.21584: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.23625: stdout chunk (state=3): >>>ansible-tmp-1727204062.208883-11822-50023996363256=/root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256 <<< 9396 1727204062.23825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.23891: stderr chunk (state=3): >>><<< 9396 1727204062.23895: stdout chunk (state=3): >>><<< 9396 1727204062.23915: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204062.208883-11822-50023996363256=/root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204062.23947: variable 'ansible_module_compression' from source: unknown 9396 1727204062.23996: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204062.24028: variable 'ansible_facts' from source: unknown 9396 1727204062.24094: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/AnsiballZ_command.py 9396 1727204062.24317: Sending initial data 9396 1727204062.24321: Sent initial data (153 bytes) 9396 1727204062.25117: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.25226: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.26879: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 9396 1727204062.26895: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 9396 1727204062.26911: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 9396 1727204062.26922: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 9396 1727204062.26934: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 9396 1727204062.26947: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 9396 1727204062.26977: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204062.27045: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204062.27111: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpc0f4e69x /root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/AnsiballZ_command.py <<< 9396 1727204062.27122: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/AnsiballZ_command.py" <<< 9396 1727204062.27167: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpc0f4e69x" to remote "/root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/AnsiballZ_command.py" <<< 9396 1727204062.28333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.28362: stdout chunk (state=3): >>><<< 9396 1727204062.28375: stderr chunk (state=3): >>><<< 9396 1727204062.28457: done transferring module to remote 9396 1727204062.28464: _low_level_execute_command(): starting 9396 1727204062.28470: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/ /root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/AnsiballZ_command.py && sleep 0' 9396 1727204062.29104: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204062.29136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204062.29211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.29272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204062.29297: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204062.29317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.29413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.31696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.31700: stdout chunk (state=3): >>><<< 9396 1727204062.31703: stderr chunk (state=3): >>><<< 9396 1727204062.31705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204062.31716: _low_level_execute_command(): starting 9396 1727204062.31719: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/AnsiballZ_command.py && sleep 0' 9396 1727204062.32427: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204062.32442: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204062.32458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204062.32476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204062.32504: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204062.32520: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204062.32535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.32606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.32651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 
9396 1727204062.32679: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204062.32702: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.32799: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.52402: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-24 14:54:22.511395", "end": "2024-09-24 14:54:22.523047", "delta": "0:00:00.011652", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204062.54221: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204062.54225: stdout chunk (state=3): >>><<< 9396 1727204062.54254: stderr chunk (state=3): >>><<< 9396 1727204062.54340: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"deprecated-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "deprecated-bond"], "start": "2024-09-24 14:54:22.511395", "end": "2024-09-24 14:54:22.523047", "delta": "0:00:00.011652", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del deprecated-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.11.210 closed. 
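The record above shows `ip link del deprecated-bond` exiting with rc=1 ("Cannot find device"), yet the task is later reported `ok` with `failed_when_result: false`. A minimal Python sketch of that evaluation, assuming the playbook task carries a `failed_when` expression that evaluates to False for this result (the expression itself is not visible in this log):

```python
# Sketch of how a failed_when override can turn a nonzero-rc command
# result into an "ok" task outcome. Assumption: the "Delete the device
# 'deprecated-bond'" task uses a failed_when expression that evaluated
# to False here; the expression is not shown in the log.
result = {
    "rc": 1,
    "stderr": 'Cannot find device "deprecated-bond"',
    "cmd": ["ip", "link", "del", "deprecated-bond"],
}

failed_when = False  # outcome of the task's failed_when expression
result["failed_when_result"] = failed_when
result["failed"] = failed_when

status = "failed" if result["failed"] else "ok"
print(status)
```

This matches the task result block further down in the log, where `rc: 1` and `failed_when_result: false` coexist and the host line reads `ok: [managed-node1]` — the cleanup is effectively idempotent: deleting an already-absent device does not fail the play.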
9396 1727204062.54407: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del deprecated-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204062.54429: _low_level_execute_command(): starting 9396 1727204062.54448: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204062.208883-11822-50023996363256/ > /dev/null 2>&1 && sleep 0' 9396 1727204062.55408: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204062.55446: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.55552: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 
originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204062.55596: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.55633: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.57696: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.57706: stdout chunk (state=3): >>><<< 9396 1727204062.57718: stderr chunk (state=3): >>><<< 9396 1727204062.57738: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204062.57801: handler run complete 9396 1727204062.57805: Evaluated conditional (False): False 9396 1727204062.57807: Evaluated conditional (False): 
False 9396 1727204062.57826: attempt loop complete, returning result 9396 1727204062.57834: _execute() done 9396 1727204062.57841: dumping result to json 9396 1727204062.57852: done dumping result, returning 9396 1727204062.57864: done running TaskExecutor() for managed-node1/TASK: Delete the device 'deprecated-bond' [12b410aa-8751-36c5-1f9e-0000000000c2] 9396 1727204062.57875: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c2 ok: [managed-node1] => { "changed": false, "cmd": [ "ip", "link", "del", "deprecated-bond" ], "delta": "0:00:00.011652", "end": "2024-09-24 14:54:22.523047", "failed_when_result": false, "rc": 1, "start": "2024-09-24 14:54:22.511395" } STDERR: Cannot find device "deprecated-bond" MSG: non-zero return code 9396 1727204062.58281: no more pending results, returning what we have 9396 1727204062.58286: results queue empty 9396 1727204062.58287: checking for any_errors_fatal 9396 1727204062.58291: done checking for any_errors_fatal 9396 1727204062.58292: checking for max_fail_percentage 9396 1727204062.58295: done checking for max_fail_percentage 9396 1727204062.58296: checking to see if all hosts have failed and the running result is not ok 9396 1727204062.58297: done checking to see if all hosts have failed 9396 1727204062.58298: getting the remaining hosts for this loop 9396 1727204062.58300: done getting the remaining hosts for this loop 9396 1727204062.58305: getting the next task for host managed-node1 9396 1727204062.58313: done getting next task for host managed-node1 9396 1727204062.58316: ^ task is: TASK: Remove test interfaces 9396 1727204062.58321: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 9396 1727204062.58326: getting variables 9396 1727204062.58328: in VariableManager get_vars() 9396 1727204062.58377: Calling all_inventory to load vars for managed-node1 9396 1727204062.58381: Calling groups_inventory to load vars for managed-node1 9396 1727204062.58384: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204062.58514: Calling all_plugins_play to load vars for managed-node1 9396 1727204062.58519: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204062.58525: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c2 9396 1727204062.58528: WORKER PROCESS EXITING 9396 1727204062.58534: Calling groups_plugins_play to load vars for managed-node1 9396 1727204062.65453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204062.67013: done with get_vars() 9396 1727204062.67036: done getting variables 9396 1727204062.67075: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Tuesday 24 September 2024 14:54:22 -0400 (0:00:00.505) 0:00:38.641 ***** 9396 1727204062.67100: entering _queue_task() for managed-node1/shell 9396 1727204062.67455: worker is 1 (out of 1 available) 9396 1727204062.67468: exiting _queue_task() for managed-node1/shell 9396 1727204062.67482: done queuing things up, now waiting for results queue to drain 9396 1727204062.67484: waiting for pending results... 9396 1727204062.67806: running TaskExecutor() for managed-node1/TASK: Remove test interfaces 9396 1727204062.67997: in run() - task 12b410aa-8751-36c5-1f9e-0000000000c6 9396 1727204062.68023: variable 'ansible_search_path' from source: unknown 9396 1727204062.68039: variable 'ansible_search_path' from source: unknown 9396 1727204062.68087: calling self._execute() 9396 1727204062.68211: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204062.68229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204062.68251: variable 'omit' from source: magic vars 9396 1727204062.68722: variable 'ansible_distribution_major_version' from source: facts 9396 1727204062.68744: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204062.68767: variable 'omit' from source: magic vars 9396 1727204062.68838: variable 'omit' from source: magic vars 9396 1727204062.68980: variable 'dhcp_interface1' from source: play vars 9396 1727204062.68985: variable 'dhcp_interface2' from source: play vars 9396 1727204062.69008: variable 'omit' from source: magic vars 9396 1727204062.69047: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204062.69076: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204062.69099: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204062.69120: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204062.69131: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204062.69160: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204062.69163: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204062.69166: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204062.69254: Set connection var ansible_timeout to 10 9396 1727204062.69261: Set connection var ansible_shell_executable to /bin/sh 9396 1727204062.69270: Set connection var ansible_pipelining to False 9396 1727204062.69276: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204062.69282: Set connection var ansible_connection to ssh 9396 1727204062.69285: Set connection var ansible_shell_type to sh 9396 1727204062.69314: variable 'ansible_shell_executable' from source: unknown 9396 1727204062.69317: variable 'ansible_connection' from source: unknown 9396 1727204062.69320: variable 'ansible_module_compression' from source: unknown 9396 1727204062.69323: variable 'ansible_shell_type' from source: unknown 9396 1727204062.69329: variable 'ansible_shell_executable' from source: unknown 9396 1727204062.69331: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204062.69338: variable 'ansible_pipelining' from source: unknown 9396 1727204062.69341: variable 'ansible_timeout' from source: unknown 9396 1727204062.69346: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204062.69485: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204062.69498: variable 'omit' from source: magic vars 9396 1727204062.69504: starting attempt loop 9396 1727204062.69508: running the handler 9396 1727204062.69521: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204062.69539: _low_level_execute_command(): starting 9396 1727204062.69546: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204062.70055: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204062.70087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.70095: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204062.70101: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.70150: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204062.70154: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.70213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.72064: stdout chunk (state=3): >>>/root <<< 9396 1727204062.72183: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.72242: stderr chunk (state=3): >>><<< 9396 1727204062.72244: stdout chunk (state=3): >>><<< 9396 1727204062.72259: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204062.72282: _low_level_execute_command(): starting 9396 1727204062.72355: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961 `" && echo ansible-tmp-1727204062.7226472-11838-115056140212961="` echo /root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961 `" ) && sleep 0' 9396 1727204062.72694: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204062.72721: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.72766: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204062.72783: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.72828: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.75013: stdout chunk (state=3): >>>ansible-tmp-1727204062.7226472-11838-115056140212961=/root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961 <<< 9396 1727204062.75017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.75111: stderr chunk (state=3): >>><<< 9396 1727204062.75115: stdout chunk (state=3): >>><<< 9396 
1727204062.75166: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204062.7226472-11838-115056140212961=/root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204062.75193: variable 'ansible_module_compression' from source: unknown 9396 1727204062.75271: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204062.75287: variable 'ansible_facts' from source: unknown 9396 1727204062.75380: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/AnsiballZ_command.py 9396 1727204062.75617: Sending initial data 9396 1727204062.75621: Sent initial data (155 bytes) 9396 1727204062.76074: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config <<< 9396 1727204062.76092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration <<< 9396 1727204062.76106: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.76168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204062.76172: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.76214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.77920: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: 
Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204062.78044: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 9396 1727204062.78136: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmp698bm976 /root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/AnsiballZ_command.py <<< 9396 1727204062.78139: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/AnsiballZ_command.py" <<< 9396 1727204062.78169: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmp698bm976" to remote "/root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/AnsiballZ_command.py" <<< 9396 1727204062.79380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.79419: stderr chunk (state=3): >>><<< 9396 1727204062.79429: stdout chunk (state=3): >>><<< 9396 1727204062.79475: done transferring module to remote 9396 1727204062.79504: _low_level_execute_command(): starting 9396 1727204062.79518: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/ /root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/AnsiballZ_command.py && sleep 0' 9396 1727204062.80192: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204062.80268: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.80424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204062.80526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.80556: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204062.82494: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204062.82567: stderr chunk (state=3): >>><<< 9396 1727204062.82570: stdout chunk (state=3): >>><<< 9396 1727204062.82593: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204062.82598: _low_level_execute_command(): starting 9396 1727204062.82688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/AnsiballZ_command.py && sleep 0' 9396 1727204062.83205: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204062.83212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204062.83295: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204062.83299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 9396 1727204062.83302: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204062.83305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204062.83340: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 9396 1727204062.83346: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204062.83383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204062.83441: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.05659: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:23.018075", "end": "2024-09-24 14:54:23.055569", "delta": "0:00:00.037494", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204063.07500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.07537: stderr chunk (state=3): >>>Shared connection to 10.31.11.210 
closed. <<< 9396 1727204063.07541: stdout chunk (state=3): >>><<< 9396 1727204063.07544: stderr chunk (state=3): >>><<< 9396 1727204063.07571: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-24 14:54:23.018075", "end": "2024-09-24 14:54:23.055569", "delta": "0:00:00.037494", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204063.07661: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, 
'_ansible_target_log_info': None}) 9396 1727204063.07664: _low_level_execute_command(): starting 9396 1727204063.07667: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204062.7226472-11838-115056140212961/ > /dev/null 2>&1 && sleep 0' 9396 1727204063.08407: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.08457: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204063.08476: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204063.08499: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.08570: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.10647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.10674: stdout chunk (state=3): >>><<< 9396 1727204063.10678: stderr chunk (state=3): >>><<< 9396 1727204063.10700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
9396 1727204063.10896: handler run complete
9396 1727204063.10899: Evaluated conditional (False): False
9396 1727204063.10902: attempt loop complete, returning result
9396 1727204063.10905: _execute() done
9396 1727204063.10908: dumping result to json
9396 1727204063.10910: done dumping result, returning
9396 1727204063.10912: done running TaskExecutor() for managed-node1/TASK: Remove test interfaces [12b410aa-8751-36c5-1f9e-0000000000c6]
9396 1727204063.10915: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c6
ok: [managed-node1] => {
    "changed": false,
    "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n",
    "delta": "0:00:00.037494",
    "end": "2024-09-24 14:54:23.055569",
    "rc": 0,
    "start": "2024-09-24 14:54:23.018075"
}

STDERR:

+ exec
+ rc=0
+ ip link delete test1
+ '[' 0 '!=' 0 ']'
+ ip link delete test2
+ '[' 0 '!=' 0 ']'
+ ip link delete testbr
+ '[' 0 '!=' 0 ']'
9396 1727204063.11094: no more pending results, returning what we have
9396 1727204063.11099: results queue empty
9396 1727204063.11101: checking for any_errors_fatal
9396 1727204063.11112: done checking for any_errors_fatal
9396 1727204063.11114: checking for max_fail_percentage
9396 1727204063.11116: done checking for max_fail_percentage
9396 1727204063.11117: checking to see if all hosts have failed and the running result is not ok
9396 1727204063.11118: done checking to see if all hosts have failed
9396 1727204063.11119: getting the remaining hosts for this loop
9396 1727204063.11121: done getting the remaining hosts for this loop
9396 1727204063.11126: getting the next task for host managed-node1
9396 1727204063.11134: done getting next task for host managed-node1
9396 1727204063.11139: ^ task is: TASK: Stop dnsmasq/radvd services
9396 1727204063.11144: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
9396 1727204063.11151: getting variables
9396 1727204063.11154: in VariableManager get_vars()
9396 1727204063.11309: Calling all_inventory to load vars for managed-node1
9396 1727204063.11313: Calling groups_inventory to load vars for managed-node1
9396 1727204063.11317: Calling all_plugins_inventory to load vars for managed-node1
9396 1727204063.11331: Calling all_plugins_play to load vars for managed-node1
9396 1727204063.11335: Calling groups_plugins_inventory to load vars for managed-node1
9396 1727204063.11339: Calling groups_plugins_play to load vars for managed-node1
9396 1727204063.12105: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c6
9396 1727204063.12111: WORKER PROCESS EXITING
9396 1727204063.12942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
9396 1727204063.15048: done with get_vars()
9396 1727204063.15071: done getting variables
9396 1727204063.15130: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Stop dnsmasq/radvd services] *********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23
Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.480) 0:00:39.122 *****
9396 1727204063.15158: entering _queue_task() for managed-node1/shell
9396 1727204063.15418: worker is 1 (out of 1 available)
9396 1727204063.15435: exiting _queue_task() for managed-node1/shell
9396 1727204063.15447: done queuing things up, now waiting for results queue to drain
9396 1727204063.15449: waiting for pending results...
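The `( umask 77 && mkdir -p ... && mkdir ... )` command that `_low_level_execute_command()` runs before each module transfer above creates the remote temp directory with owner-only permissions in a single shell invocation. A minimal standalone sketch of the same idiom follows; the path and directory names here are illustrative placeholders, not the timestamped names Ansible actually generates:

```shell
#!/bin/sh
# Sketch of the permission-safe temp-dir idiom seen in the log.
# umask 77 makes every directory created in the subshell mode 0700
# (owner-only), which is how Ansible protects its remote payload dirs.
# NOTE: "demo-ansible-tmp" is a hypothetical name for illustration.
tmp_root="${TMPDIR:-/tmp}/demo-ansible-tmp.$$"
( umask 77 && mkdir -p "$tmp_root" && mkdir "$tmp_root/ansible-tmp-demo" ) || exit 1
# Show the resulting permission bits of the inner directory
ls -ld "$tmp_root/ansible-tmp-demo" | cut -c1-10
rm -rf "$tmp_root"
```

Because the `umask` change happens inside the parenthesized subshell, the caller's umask is untouched afterward, which is why Ansible can use this without side effects on the login shell.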
9396 1727204063.15645: running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services 9396 1727204063.15763: in run() - task 12b410aa-8751-36c5-1f9e-0000000000c7 9396 1727204063.15777: variable 'ansible_search_path' from source: unknown 9396 1727204063.15782: variable 'ansible_search_path' from source: unknown 9396 1727204063.15819: calling self._execute() 9396 1727204063.15904: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204063.15994: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204063.15997: variable 'omit' from source: magic vars 9396 1727204063.16255: variable 'ansible_distribution_major_version' from source: facts 9396 1727204063.16267: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204063.16274: variable 'omit' from source: magic vars 9396 1727204063.16323: variable 'omit' from source: magic vars 9396 1727204063.16358: variable 'omit' from source: magic vars 9396 1727204063.16393: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204063.16426: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204063.16449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204063.16466: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204063.16478: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204063.16507: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204063.16513: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204063.16517: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204063.16604: Set connection 
var ansible_timeout to 10 9396 1727204063.16612: Set connection var ansible_shell_executable to /bin/sh 9396 1727204063.16621: Set connection var ansible_pipelining to False 9396 1727204063.16628: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204063.16634: Set connection var ansible_connection to ssh 9396 1727204063.16637: Set connection var ansible_shell_type to sh 9396 1727204063.16663: variable 'ansible_shell_executable' from source: unknown 9396 1727204063.16668: variable 'ansible_connection' from source: unknown 9396 1727204063.16672: variable 'ansible_module_compression' from source: unknown 9396 1727204063.16674: variable 'ansible_shell_type' from source: unknown 9396 1727204063.16676: variable 'ansible_shell_executable' from source: unknown 9396 1727204063.16678: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204063.16681: variable 'ansible_pipelining' from source: unknown 9396 1727204063.16687: variable 'ansible_timeout' from source: unknown 9396 1727204063.16692: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204063.16818: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204063.16830: variable 'omit' from source: magic vars 9396 1727204063.16836: starting attempt loop 9396 1727204063.16839: running the handler 9396 1727204063.16849: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204063.16867: _low_level_execute_command(): starting 9396 
1727204063.16876: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204063.17646: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 9396 1727204063.17655: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.17696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.19508: stdout chunk (state=3): >>>/root <<< 9396 1727204063.19613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.19663: stderr chunk (state=3): >>><<< 9396 1727204063.19667: stdout chunk (state=3): >>><<< 9396 1727204063.19689: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204063.19704: _low_level_execute_command(): starting 9396 1727204063.19712: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527 `" && echo ansible-tmp-1727204063.1969116-11870-157443004113527="` echo /root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527 `" ) && sleep 0' 9396 1727204063.20311: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.20352: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204063.20400: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204063.20404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.20485: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.22601: stdout chunk (state=3): >>>ansible-tmp-1727204063.1969116-11870-157443004113527=/root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527 <<< 9396 1727204063.22719: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.22779: stderr chunk (state=3): >>><<< 9396 1727204063.22783: stdout chunk (state=3): >>><<< 9396 1727204063.22804: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204063.1969116-11870-157443004113527=/root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204063.22837: variable 'ansible_module_compression' from source: unknown 9396 1727204063.22884: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204063.22922: variable 'ansible_facts' from source: unknown 9396 1727204063.22980: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/AnsiballZ_command.py 9396 1727204063.23106: Sending initial data 9396 1727204063.23109: Sent initial data (155 bytes) 9396 1727204063.23561: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204063.23596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204063.23600: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204063.23602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.23606: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204063.23610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.23714: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204063.23744: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.23816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.25613: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204063.25688: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204063.25731: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpp5t1q84o /root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/AnsiballZ_command.py <<< 9396 1727204063.25735: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/AnsiballZ_command.py" <<< 9396 1727204063.25764: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpp5t1q84o" to remote "/root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/AnsiballZ_command.py" <<< 9396 1727204063.25772: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/AnsiballZ_command.py" <<< 9396 1727204063.26586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.26665: stderr chunk (state=3): >>><<< 9396 1727204063.26669: stdout chunk (state=3): >>><<< 9396 1727204063.26691: done transferring module to remote 9396 1727204063.26707: _low_level_execute_command(): starting 9396 1727204063.26716: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/ /root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/AnsiballZ_command.py && sleep 0' 9396 1727204063.27181: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204063.27185: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204063.27228: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204063.27231: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.27234: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204063.27237: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204063.27239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.27297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204063.27300: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.27358: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.29331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.29386: stderr chunk (state=3): >>><<< 9396 1727204063.29393: stdout chunk (state=3): >>><<< 9396 1727204063.29417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204063.29421: _low_level_execute_command(): starting 9396 1727204063.29427: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/AnsiballZ_command.py && sleep 0' 9396 1727204063.29893: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204063.29932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204063.29936: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204063.29939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.29941: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204063.29943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204063.29945: stderr chunk (state=3): >>>debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.29999: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204063.30002: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.30059: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.51019: stdout chunk (state=3): >>> <<< 9396 1727204063.51024: stdout chunk (state=3): >>>{"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:23.477495", "end": "2024-09-24 14:54:23.508019", "delta": "0:00:00.030524", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, 
"stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204063.52781: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. <<< 9396 1727204063.52847: stderr chunk (state=3): >>><<< 9396 1727204063.52850: stdout chunk (state=3): >>><<< 9396 1727204063.52871: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-24 14:54:23.477495", "end": "2024-09-24 14:54:23.508019", "delta": "0:00:00.030524", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, 
"chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
9396 1727204063.52919: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204063.52928: _low_level_execute_command(): starting 9396 1727204063.52934: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204063.1969116-11870-157443004113527/ > /dev/null 2>&1 && sleep 0' 9396 1727204063.53424: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204063.53428: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found <<< 9396 1727204063.53430: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.53433: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found <<< 9396 1727204063.53438: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.53497: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 9396 1727204063.53500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204063.53502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.53544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.55503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.55552: stderr chunk (state=3): >>><<< 9396 1727204063.55556: stdout chunk (state=3): >>><<< 9396 1727204063.55571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204063.55579: handler run complete 9396 1727204063.55603: Evaluated conditional (False): False 9396 1727204063.55621: attempt loop complete, returning result 9396 1727204063.55624: _execute() done 9396 1727204063.55626: dumping result to json 9396 1727204063.55631: done dumping result, returning 9396 1727204063.55639: done running TaskExecutor() for managed-node1/TASK: Stop dnsmasq/radvd services [12b410aa-8751-36c5-1f9e-0000000000c7] 9396 1727204063.55644: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c7 9396 1727204063.55753: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c7 9396 1727204063.55755: WORKER PROCESS EXITING ok: [managed-node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.030524", "end": "2024-09-24 14:54:23.508019", "rc": 0, "start": "2024-09-24 14:54:23.477495" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + 
grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 9396 1727204063.55838: no more pending results, returning what we have 9396 1727204063.55842: results queue empty 9396 1727204063.55843: checking for any_errors_fatal 9396 1727204063.55855: done checking for any_errors_fatal 9396 1727204063.55856: checking for max_fail_percentage 9396 1727204063.55858: done checking for max_fail_percentage 9396 1727204063.55859: checking to see if all hosts have failed and the running result is not ok 9396 1727204063.55860: done checking to see if all hosts have failed 9396 1727204063.55861: getting the remaining hosts for this loop 9396 1727204063.55863: done getting the remaining hosts for this loop 9396 1727204063.55874: getting the next task for host managed-node1 9396 1727204063.55883: done getting next task for host managed-node1 9396 1727204063.55886: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 9396 1727204063.55891: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204063.55895: getting variables 9396 1727204063.55897: in VariableManager get_vars() 9396 1727204063.55939: Calling all_inventory to load vars for managed-node1 9396 1727204063.55943: Calling groups_inventory to load vars for managed-node1 9396 1727204063.55946: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204063.55957: Calling all_plugins_play to load vars for managed-node1 9396 1727204063.55960: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204063.55969: Calling groups_plugins_play to load vars for managed-node1 9396 1727204063.57944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204063.59499: done with get_vars() 9396 1727204063.59527: done getting variables 9396 1727204063.59581: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:131 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.444) 0:00:39.567 ***** 9396 1727204063.59612: entering _queue_task() for managed-node1/command 9396 1727204063.59884: worker is 1 (out of 1 available) 9396 1727204063.59900: exiting _queue_task() for managed-node1/command 9396 1727204063.59916: done queuing things up, now waiting for results queue to drain 9396 1727204063.59918: waiting for pending results... 
9396 1727204063.60511: running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript 9396 1727204063.60595: in run() - task 12b410aa-8751-36c5-1f9e-0000000000c8 9396 1727204063.60600: variable 'ansible_search_path' from source: unknown 9396 1727204063.60605: calling self._execute() 9396 1727204063.60608: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204063.60610: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204063.60614: variable 'omit' from source: magic vars 9396 1727204063.60997: variable 'ansible_distribution_major_version' from source: facts 9396 1727204063.61014: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204063.61167: variable 'network_provider' from source: set_fact 9396 1727204063.61172: Evaluated conditional (network_provider == "initscripts"): False 9396 1727204063.61182: when evaluation is False, skipping this task 9396 1727204063.61186: _execute() done 9396 1727204063.61276: dumping result to json 9396 1727204063.61279: done dumping result, returning 9396 1727204063.61282: done running TaskExecutor() for managed-node1/TASK: Restore the /etc/resolv.conf for initscript [12b410aa-8751-36c5-1f9e-0000000000c8] 9396 1727204063.61285: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c8 9396 1727204063.61364: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c8 9396 1727204063.61367: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 9396 1727204063.61422: no more pending results, returning what we have 9396 1727204063.61428: results queue empty 9396 1727204063.61429: checking for any_errors_fatal 9396 1727204063.61440: done checking for any_errors_fatal 9396 1727204063.61441: checking for max_fail_percentage 9396 1727204063.61443: done checking for max_fail_percentage 
9396 1727204063.61444: checking to see if all hosts have failed and the running result is not ok 9396 1727204063.61445: done checking to see if all hosts have failed 9396 1727204063.61446: getting the remaining hosts for this loop 9396 1727204063.61447: done getting the remaining hosts for this loop 9396 1727204063.61452: getting the next task for host managed-node1 9396 1727204063.61459: done getting next task for host managed-node1 9396 1727204063.61463: ^ task is: TASK: Verify network state restored to default 9396 1727204063.61468: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204063.61474: getting variables 9396 1727204063.61475: in VariableManager get_vars() 9396 1727204063.61525: Calling all_inventory to load vars for managed-node1 9396 1727204063.61529: Calling groups_inventory to load vars for managed-node1 9396 1727204063.61532: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204063.61549: Calling all_plugins_play to load vars for managed-node1 9396 1727204063.61553: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204063.61557: Calling groups_plugins_play to load vars for managed-node1 9396 1727204063.63933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204063.67213: done with get_vars() 9396 1727204063.67253: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:136 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.077) 0:00:39.644 ***** 9396 1727204063.67371: entering _queue_task() for managed-node1/include_tasks 9396 1727204063.67741: worker is 1 (out of 1 available) 9396 1727204063.67755: exiting _queue_task() for managed-node1/include_tasks 9396 1727204063.67768: done queuing things up, now waiting for results queue to drain 9396 1727204063.67770: waiting for pending results... 
9396 1727204063.68249: running TaskExecutor() for managed-node1/TASK: Verify network state restored to default 9396 1727204063.68409: in run() - task 12b410aa-8751-36c5-1f9e-0000000000c9 9396 1727204063.68423: variable 'ansible_search_path' from source: unknown 9396 1727204063.68464: calling self._execute() 9396 1727204063.68691: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204063.68837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204063.68842: variable 'omit' from source: magic vars 9396 1727204063.69660: variable 'ansible_distribution_major_version' from source: facts 9396 1727204063.69674: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204063.69711: _execute() done 9396 1727204063.69719: dumping result to json 9396 1727204063.69722: done dumping result, returning 9396 1727204063.69725: done running TaskExecutor() for managed-node1/TASK: Verify network state restored to default [12b410aa-8751-36c5-1f9e-0000000000c9] 9396 1727204063.69827: sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c9 9396 1727204063.70011: done sending task result for task 12b410aa-8751-36c5-1f9e-0000000000c9 9396 1727204063.70015: WORKER PROCESS EXITING 9396 1727204063.70059: no more pending results, returning what we have 9396 1727204063.70066: in VariableManager get_vars() 9396 1727204063.70122: Calling all_inventory to load vars for managed-node1 9396 1727204063.70125: Calling groups_inventory to load vars for managed-node1 9396 1727204063.70128: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204063.70146: Calling all_plugins_play to load vars for managed-node1 9396 1727204063.70151: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204063.70155: Calling groups_plugins_play to load vars for managed-node1 9396 1727204063.74011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 9396 1727204063.76963: done with get_vars() 9396 1727204063.76996: variable 'ansible_search_path' from source: unknown 9396 1727204063.77013: we have included files to process 9396 1727204063.77014: generating all_blocks data 9396 1727204063.77018: done generating all_blocks data 9396 1727204063.77024: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 9396 1727204063.77025: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 9396 1727204063.77028: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 9396 1727204063.77534: done processing included file 9396 1727204063.77537: iterating over new_blocks loaded from include file 9396 1727204063.77539: in VariableManager get_vars() 9396 1727204063.77564: done with get_vars() 9396 1727204063.77566: filtering new block on tags 9396 1727204063.77613: done filtering new block on tags 9396 1727204063.77616: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node1 9396 1727204063.77622: extending task lists for all hosts with included blocks 9396 1727204063.79272: done extending task lists 9396 1727204063.79274: done processing included files 9396 1727204063.79275: results queue empty 9396 1727204063.79276: checking for any_errors_fatal 9396 1727204063.79280: done checking for any_errors_fatal 9396 1727204063.79281: checking for max_fail_percentage 9396 1727204063.79282: done checking for max_fail_percentage 9396 1727204063.79283: checking to see if all hosts have failed and the running result is not ok 9396 1727204063.79284: done checking to see if all hosts have failed 9396 1727204063.79285: getting the 
remaining hosts for this loop 9396 1727204063.79286: done getting the remaining hosts for this loop 9396 1727204063.79291: getting the next task for host managed-node1 9396 1727204063.79297: done getting next task for host managed-node1 9396 1727204063.79300: ^ task is: TASK: Check routes and DNS 9396 1727204063.79303: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204063.79306: getting variables 9396 1727204063.79307: in VariableManager get_vars() 9396 1727204063.79325: Calling all_inventory to load vars for managed-node1 9396 1727204063.79328: Calling groups_inventory to load vars for managed-node1 9396 1727204063.79331: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204063.79340: Calling all_plugins_play to load vars for managed-node1 9396 1727204063.79344: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204063.79348: Calling groups_plugins_play to load vars for managed-node1 9396 1727204063.81457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204063.84878: done with get_vars() 9396 1727204063.85121: done getting variables 9396 1727204063.85180: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:54:23 -0400 (0:00:00.178) 0:00:39.823 ***** 9396 1727204063.85220: entering _queue_task() for managed-node1/shell 9396 1727204063.86006: worker is 1 (out of 1 available) 9396 1727204063.86021: exiting _queue_task() for managed-node1/shell 9396 1727204063.86036: done queuing things up, now waiting for results queue to drain 9396 1727204063.86038: waiting for pending results... 
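Annotation: the module invocation recorded later in this log (the `_raw_params` field of the `ansible.legacy.command` call) contains the exact shell script this "Check routes and DNS" task runs. Extracted and reformatted as a task sketch, with the script body taken verbatim from the logged module args:

```yaml
# Sketch of the task at check_network_dns.yml:6, assembled from the
# module_args captured in this log; the surrounding task keywords are
# assumptions, the shell script itself is verbatim from the log.
- name: Check routes and DNS
  shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
      cat /etc/resolv.conf
    else
      echo NO /etc/resolv.conf
      ls -alrtF /etc/resolv.* || :
    fi
```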
9396 1727204063.87224: running TaskExecutor() for managed-node1/TASK: Check routes and DNS 9396 1727204063.87229: in run() - task 12b410aa-8751-36c5-1f9e-000000000570 9396 1727204063.87232: variable 'ansible_search_path' from source: unknown 9396 1727204063.87234: variable 'ansible_search_path' from source: unknown 9396 1727204063.87304: calling self._execute() 9396 1727204063.87710: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204063.87715: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204063.87720: variable 'omit' from source: magic vars 9396 1727204063.88509: variable 'ansible_distribution_major_version' from source: facts 9396 1727204063.88592: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204063.88609: variable 'omit' from source: magic vars 9396 1727204063.88749: variable 'omit' from source: magic vars 9396 1727204063.88841: variable 'omit' from source: magic vars 9396 1727204063.88946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 9396 1727204063.89053: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 9396 1727204063.89154: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 9396 1727204063.89225: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204063.89244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 9396 1727204063.89459: variable 'inventory_hostname' from source: host vars for 'managed-node1' 9396 1727204063.89462: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204063.89466: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204063.89770: Set connection var 
ansible_timeout to 10 9396 1727204063.89774: Set connection var ansible_shell_executable to /bin/sh 9396 1727204063.89776: Set connection var ansible_pipelining to False 9396 1727204063.89778: Set connection var ansible_module_compression to ZIP_DEFLATED 9396 1727204063.89780: Set connection var ansible_connection to ssh 9396 1727204063.89791: Set connection var ansible_shell_type to sh 9396 1727204063.89828: variable 'ansible_shell_executable' from source: unknown 9396 1727204063.89837: variable 'ansible_connection' from source: unknown 9396 1727204063.89884: variable 'ansible_module_compression' from source: unknown 9396 1727204063.89894: variable 'ansible_shell_type' from source: unknown 9396 1727204063.89902: variable 'ansible_shell_executable' from source: unknown 9396 1727204063.89912: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204063.89920: variable 'ansible_pipelining' from source: unknown 9396 1727204063.89927: variable 'ansible_timeout' from source: unknown 9396 1727204063.90097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204063.90279: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204063.90496: variable 'omit' from source: magic vars 9396 1727204063.90499: starting attempt loop 9396 1727204063.90502: running the handler 9396 1727204063.90504: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 9396 1727204063.90510: _low_level_execute_command(): starting 9396 
1727204063.90512: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 9396 1727204063.91962: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204063.92072: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204063.92112: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204063.92181: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204063.92302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.92369: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.94243: stdout chunk (state=3): >>>/root <<< 9396 1727204063.94355: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.94755: stderr chunk (state=3): >>><<< 9396 1727204063.94759: stdout chunk (state=3): >>><<< 9396 1727204063.94764: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204063.94766: _low_level_execute_command(): starting 9396 1727204063.94770: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126 `" && echo ansible-tmp-1727204063.9464464-11888-35189740548126="` echo /root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126 `" ) && sleep 0' 9396 1727204063.95984: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204063.96211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204063.96252: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204063.96309: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204063.98380: stdout chunk (state=3): >>>ansible-tmp-1727204063.9464464-11888-35189740548126=/root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126 <<< 9396 1727204063.98518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204063.98584: stderr chunk (state=3): >>><<< 9396 1727204063.98596: stdout chunk (state=3): >>><<< 9396 1727204063.98895: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204063.9464464-11888-35189740548126=/root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204063.98898: variable 'ansible_module_compression' from source: unknown 9396 1727204063.98901: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-9396sveujo_f/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 9396 1727204063.98903: variable 'ansible_facts' from source: unknown 9396 1727204063.99153: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/AnsiballZ_command.py 9396 1727204063.99521: Sending initial data 9396 1727204063.99524: Sent initial data (154 bytes) 9396 1727204064.00912: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204064.00965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204064.01011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204064.01115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204064.02853: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 9396 1727204064.03027: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 9396 1727204064.03154: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-9396sveujo_f/tmpahte7e1e /root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/AnsiballZ_command.py <<< 9396 1727204064.03161: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/AnsiballZ_command.py" <<< 9396 1727204064.03184: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-9396sveujo_f/tmpahte7e1e" to remote "/root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/AnsiballZ_command.py" <<< 9396 1727204064.06397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204064.06401: stdout chunk (state=3): >>><<< 9396 1727204064.06404: stderr chunk (state=3): >>><<< 9396 1727204064.06526: done transferring module to remote 9396 1727204064.06530: _low_level_execute_command(): starting 9396 1727204064.06533: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/ /root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/AnsiballZ_command.py && sleep 0' 9396 1727204064.07699: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204064.07717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204064.07911: stderr chunk (state=3): >>>debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204064.08010: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204064.08306: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204064.10233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204064.10412: stderr chunk (state=3): >>><<< 9396 1727204064.10416: stdout chunk (state=3): >>><<< 9396 1727204064.10436: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying 
existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204064.10451: _low_level_execute_command(): starting 9396 1727204064.10461: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/AnsiballZ_command.py && sleep 0' 9396 1727204064.11528: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204064.11804: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204064.12006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204064.12105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204064.31409: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 
127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:d4:45:6e:f8:dd brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.210/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3064sec preferred_lft 3064sec\n inet6 fe80::d080:f60d:659:9515/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.210 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.210 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:24.303733", "end": "2024-09-24 14:54:24.312908", "delta": "0:00:00.009175", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 9396 1727204064.33699: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 
<<< 9396 1727204064.33704: stdout chunk (state=3): >>><<< 9396 1727204064.33709: stderr chunk (state=3): >>><<< 9396 1727204064.33712: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:d4:45:6e:f8:dd brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.11.210/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3064sec preferred_lft 3064sec\n inet6 fe80::d080:f60d:659:9515/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.210 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.210 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. 
To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:54:24.303733", "end": "2024-09-24 14:54:24.312908", "delta": "0:00:00.009175", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.11.210 closed. 9396 1727204064.33721: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 9396 1727204064.33730: _low_level_execute_command(): starting 9396 1727204064.33738: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204063.9464464-11888-35189740548126/ > /dev/null 2>&1 && sleep 0' 9396 1727204064.35206: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 9396 1727204064.35215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204064.35229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204064.35245: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 9396 1727204064.35260: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 <<< 9396 1727204064.35266: stderr chunk (state=3): >>>debug2: match not found <<< 9396 1727204064.35278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 9396 1727204064.35295: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 9396 1727204064.35304: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.11.210 is address <<< 9396 1727204064.35313: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 9396 1727204064.35321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 9396 1727204064.35333: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 9396 1727204064.35580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 9396 1727204064.35592: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 9396 1727204064.35613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 9396 1727204064.35691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 9396 1727204064.37796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 9396 1727204064.37818: stderr chunk (state=3): >>><<< 9396 1727204064.38097: stdout chunk (state=3): >>><<< 9396 1727204064.38101: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.11.210 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.11.210 originally 10.31.11.210 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 9396 1727204064.38104: handler run complete 9396 1727204064.38106: Evaluated conditional (False): False 9396 1727204064.38111: attempt loop complete, returning result 9396 1727204064.38113: _execute() done 9396 1727204064.38115: dumping result to json 9396 1727204064.38117: done dumping result, returning 9396 1727204064.38119: done running TaskExecutor() for managed-node1/TASK: Check routes and DNS [12b410aa-8751-36c5-1f9e-000000000570] 9396 1727204064.38121: sending task result for task 12b410aa-8751-36c5-1f9e-000000000570 ok: [managed-node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009175", "end": "2024-09-24 14:54:24.312908", "rc": 0, "start": "2024-09-24 14:54:24.303733" } STDOUT: IP 1: lo: mtu 65536 qdisc 
noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:d4:45:6e:f8:dd brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.11.210/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3064sec preferred_lft 3064sec inet6 fe80::d080:f60d:659:9515/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.11.210 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.11.210 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 9396 1727204064.38340: no more pending results, returning what we have 9396 1727204064.38345: results queue empty 9396 1727204064.38346: checking for any_errors_fatal 9396 1727204064.38348: done checking for any_errors_fatal 9396 1727204064.38349: checking for max_fail_percentage 9396 1727204064.38351: done checking for max_fail_percentage 9396 1727204064.38352: checking to see if all hosts have failed and the running result is not ok 9396 1727204064.38353: done checking to see if all hosts have failed 9396 1727204064.38354: getting the remaining hosts for this loop 9396 1727204064.38360: done getting the remaining hosts for this loop 9396 1727204064.38366: getting the next task for host managed-node1 9396 1727204064.38373: done getting next task for host managed-node1 9396 1727204064.38377: ^ task is: TASK: Verify DNS and network connectivity 9396 1727204064.38382: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 9396 1727204064.38388: getting variables 9396 1727204064.38494: in VariableManager get_vars() 9396 1727204064.38543: Calling all_inventory to load vars for managed-node1 9396 1727204064.38546: Calling groups_inventory to load vars for managed-node1 9396 1727204064.38549: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204064.38563: Calling all_plugins_play to load vars for managed-node1 9396 1727204064.38566: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204064.38570: Calling groups_plugins_play to load vars for managed-node1 9396 1727204064.39308: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000570 9396 1727204064.39312: WORKER PROCESS EXITING 9396 1727204064.43210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204064.49238: done with get_vars() 9396 1727204064.49285: done getting variables 9396 1727204064.49363: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:54:24 -0400 (0:00:00.643) 0:00:40.466 ***** 9396 1727204064.49606: entering _queue_task() for managed-node1/shell 9396 1727204064.50365: worker is 1 (out of 1 available) 9396 1727204064.50383: exiting _queue_task() for managed-node1/shell 9396 1727204064.50400: done queuing things up, now waiting for results queue to drain 9396 1727204064.50402: waiting for pending results... 
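[editor's note] The "Check routes and DNS" task whose result appears above can be reconstructed from the logged `_raw_params` and the `shell` action plugin load. This is a sketch, not the verbatim contents of `check_network_dns.yml` — the command body is copied exactly from the log, while the module name and YAML layout are assumptions:

```yaml
# Hedged reconstruction of the task at
# tests/network/playbooks/tasks/check_network_dns.yml:6.
# The script is verbatim from the logged _raw_params; the surrounding
# keys (module FQCN, block style) are assumed, not confirmed by the log.
- name: Check routes and DNS
  ansible.builtin.shell: |
    set -euo pipefail
    echo IP
    ip a
    echo IP ROUTE
    ip route
    echo IP -6 ROUTE
    ip -6 route
    echo RESOLV
    if [ -f /etc/resolv.conf ]; then
        cat /etc/resolv.conf
    else
        echo NO /etc/resolv.conf
        ls -alrtF /etc/resolv.* || :
    fi
```

Note the `set -euo pipefail` prologue: any failing diagnostic command (e.g. a missing `ip` binary) would fail the whole task rather than produce partial output, which is why the logged result's `rc=0` confirms every probe succeeded.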
9396 1727204064.50746: running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity 9396 1727204064.50885: in run() - task 12b410aa-8751-36c5-1f9e-000000000571 9396 1727204064.50910: variable 'ansible_search_path' from source: unknown 9396 1727204064.50914: variable 'ansible_search_path' from source: unknown 9396 1727204064.50977: calling self._execute() 9396 1727204064.51102: variable 'ansible_host' from source: host vars for 'managed-node1' 9396 1727204064.51113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node1' 9396 1727204064.51127: variable 'omit' from source: magic vars 9396 1727204064.51629: variable 'ansible_distribution_major_version' from source: facts 9396 1727204064.51644: Evaluated conditional (ansible_distribution_major_version != '6'): True 9396 1727204064.51862: variable 'ansible_facts' from source: unknown 9396 1727204064.53251: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 9396 1727204064.53256: when evaluation is False, skipping this task 9396 1727204064.53260: _execute() done 9396 1727204064.53262: dumping result to json 9396 1727204064.53264: done dumping result, returning 9396 1727204064.53266: done running TaskExecutor() for managed-node1/TASK: Verify DNS and network connectivity [12b410aa-8751-36c5-1f9e-000000000571] 9396 1727204064.53268: sending task result for task 12b410aa-8751-36c5-1f9e-000000000571 9396 1727204064.53344: done sending task result for task 12b410aa-8751-36c5-1f9e-000000000571 9396 1727204064.53349: WORKER PROCESS EXITING skipping: [managed-node1] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 9396 1727204064.53414: no more pending results, returning what we have 9396 1727204064.53420: results queue empty 9396 1727204064.53421: checking for any_errors_fatal 9396 1727204064.53436: done checking for any_errors_fatal 9396 1727204064.53437: checking for 
max_fail_percentage 9396 1727204064.53439: done checking for max_fail_percentage 9396 1727204064.53440: checking to see if all hosts have failed and the running result is not ok 9396 1727204064.53442: done checking to see if all hosts have failed 9396 1727204064.53443: getting the remaining hosts for this loop 9396 1727204064.53444: done getting the remaining hosts for this loop 9396 1727204064.53450: getting the next task for host managed-node1 9396 1727204064.53576: done getting next task for host managed-node1 9396 1727204064.53581: ^ task is: TASK: meta (flush_handlers) 9396 1727204064.53584: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204064.53591: getting variables 9396 1727204064.53593: in VariableManager get_vars() 9396 1727204064.53642: Calling all_inventory to load vars for managed-node1 9396 1727204064.53646: Calling groups_inventory to load vars for managed-node1 9396 1727204064.53650: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204064.53665: Calling all_plugins_play to load vars for managed-node1 9396 1727204064.53669: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204064.53674: Calling groups_plugins_play to load vars for managed-node1 9396 1727204064.57049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204064.63088: done with get_vars() 9396 1727204064.63126: done getting variables 9396 1727204064.63423: in VariableManager get_vars() 9396 1727204064.63442: Calling all_inventory to load vars for managed-node1 9396 1727204064.63445: Calling groups_inventory to load vars for managed-node1 9396 1727204064.63448: Calling all_plugins_inventory to load vars for managed-node1 9396 
1727204064.63455: Calling all_plugins_play to load vars for managed-node1 9396 1727204064.63458: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204064.63462: Calling groups_plugins_play to load vars for managed-node1 9396 1727204064.67423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204064.71085: done with get_vars() 9396 1727204064.71143: done queuing things up, now waiting for results queue to drain 9396 1727204064.71146: results queue empty 9396 1727204064.71147: checking for any_errors_fatal 9396 1727204064.71151: done checking for any_errors_fatal 9396 1727204064.71152: checking for max_fail_percentage 9396 1727204064.71154: done checking for max_fail_percentage 9396 1727204064.71155: checking to see if all hosts have failed and the running result is not ok 9396 1727204064.71156: done checking to see if all hosts have failed 9396 1727204064.71157: getting the remaining hosts for this loop 9396 1727204064.71158: done getting the remaining hosts for this loop 9396 1727204064.71162: getting the next task for host managed-node1 9396 1727204064.71167: done getting next task for host managed-node1 9396 1727204064.71169: ^ task is: TASK: meta (flush_handlers) 9396 1727204064.71171: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 9396 1727204064.71175: getting variables 9396 1727204064.71176: in VariableManager get_vars() 9396 1727204064.71401: Calling all_inventory to load vars for managed-node1 9396 1727204064.71404: Calling groups_inventory to load vars for managed-node1 9396 1727204064.71407: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204064.71416: Calling all_plugins_play to load vars for managed-node1 9396 1727204064.71419: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204064.71423: Calling groups_plugins_play to load vars for managed-node1 9396 1727204064.75416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204064.81171: done with get_vars() 9396 1727204064.81422: done getting variables 9396 1727204064.81494: in VariableManager get_vars() 9396 1727204064.81515: Calling all_inventory to load vars for managed-node1 9396 1727204064.81518: Calling groups_inventory to load vars for managed-node1 9396 1727204064.81521: Calling all_plugins_inventory to load vars for managed-node1 9396 1727204064.81528: Calling all_plugins_play to load vars for managed-node1 9396 1727204064.81531: Calling groups_plugins_inventory to load vars for managed-node1 9396 1727204064.81535: Calling groups_plugins_play to load vars for managed-node1 9396 1727204064.85524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 9396 1727204064.91823: done with get_vars() 9396 1727204064.91865: done queuing things up, now waiting for results queue to drain 9396 1727204064.91868: results queue empty 9396 1727204064.91870: checking for any_errors_fatal 9396 1727204064.91872: done checking for any_errors_fatal 9396 1727204064.91873: checking for max_fail_percentage 9396 1727204064.91874: done checking for max_fail_percentage 9396 1727204064.91875: checking to see if all hosts have failed and the running result is not ok 9396 1727204064.91876: 
done checking to see if all hosts have failed 9396 1727204064.91877: getting the remaining hosts for this loop 9396 1727204064.91878: done getting the remaining hosts for this loop 9396 1727204064.91888: getting the next task for host managed-node1 9396 1727204064.91894: done getting next task for host managed-node1 9396 1727204064.91895: ^ task is: None 9396 1727204064.91897: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 9396 1727204064.91899: done queuing things up, now waiting for results queue to drain 9396 1727204064.91900: results queue empty 9396 1727204064.91901: checking for any_errors_fatal 9396 1727204064.91902: done checking for any_errors_fatal 9396 1727204064.91903: checking for max_fail_percentage 9396 1727204064.91904: done checking for max_fail_percentage 9396 1727204064.91905: checking to see if all hosts have failed and the running result is not ok 9396 1727204064.91906: done checking to see if all hosts have failed 9396 1727204064.91908: getting the next task for host managed-node1 9396 1727204064.91912: done getting next task for host managed-node1 9396 1727204064.91913: ^ task is: None 9396 1727204064.91914: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed-node1              : ok=74   changed=3    unreachable=0    failed=0    skipped=62   rescued=0    ignored=0

Tuesday 24 September 2024  14:54:24 -0400 (0:00:00.428)       0:00:40.895 *****
===============================================================================
Install dnsmasq --------------------------------------------------------- 3.40s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 2.51s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.49s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 2.00s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Install pgrep, sysctl --------------------------------------------------- 1.95s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Gathering Facts --------------------------------------------------------- 1.52s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_deprecated_nm.yml:6
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.40s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 1.35s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.20s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.18s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.00s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gathering Facts --------------------------------------------------------- 1.00s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:3
Check if system is ostree ----------------------------------------------- 0.85s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.84s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Check routes and DNS ---------------------------------------------------- 0.64s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.55s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Get stat for interface test1 -------------------------------------------- 0.54s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3
Get NM profile info ----------------------------------------------------- 0.53s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Delete the device 'deprecated-bond' ------------------------------------- 0.51s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond_deprecated.yml:125
Get NM profile info ----------------------------------------------------- 0.49s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
9396 1727204064.92487: RUNNING CLEANUP
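[editor's note] Earlier in the log, the "Verify DNS and network connectivity" task was skipped: the log shows `ansible_distribution_major_version != '6'` evaluating True and `ansible_facts["distribution"] == "CentOS"` evaluating False, producing `skip_reason: Conditional result was False`. A sketch of that conditional shape follows; both `when` expressions are taken from the log, but the task body is a hypothetical placeholder, since the task never executed and its real command does not appear anywhere in this output:

```yaml
# Sketch of the skip logic seen for the task at
# tests/network/playbooks/tasks/check_network_dns.yml:24.
# Ansible ANDs the list items; one False entry skips the task.
- name: Verify DNS and network connectivity
  ansible.builtin.shell: "true"  # placeholder -- real command not shown in the log
  when:
    - ansible_distribution_major_version != '6'   # logged as True
    - ansible_facts["distribution"] == "CentOS"   # logged as False -> skipped
```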