40074 1727204605.92210: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
40074 1727204605.92554: Added group all to inventory
40074 1727204605.92556: Added group ungrouped to inventory
40074 1727204605.92559: Group all now contains ungrouped
40074 1727204605.92562: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
40074 1727204606.04878: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
40074 1727204606.04931: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
40074 1727204606.04954: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
40074 1727204606.05004: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
40074 1727204606.05067: Loaded config def from plugin (inventory/script)
40074 1727204606.05069: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
40074 1727204606.05105: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
40074 1727204606.05179: Loaded config def from plugin (inventory/yaml)
40074 1727204606.05181: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
40074 1727204606.05253: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
40074 1727204606.05610: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
40074 1727204606.05613: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
40074 1727204606.05615: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
40074 1727204606.05620: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
40074 1727204606.05624: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
40074 1727204606.05678: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
40074 1727204606.05734: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
40074 1727204606.05766: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
40074 1727204606.05838: group all already in inventory
40074 1727204606.05843: set inventory_file for managed-node1
40074 1727204606.05847: set inventory_dir for managed-node1
40074 1727204606.05847: Added host managed-node1 to inventory
40074 1727204606.05849: Added host managed-node1 to group all
40074 1727204606.05850: set ansible_host for managed-node1
40074 1727204606.05851: set ansible_ssh_extra_args for managed-node1
40074 1727204606.05853: set inventory_file for managed-node2
40074 1727204606.05855: set inventory_dir for managed-node2
40074 1727204606.05856: Added host managed-node2 to inventory
40074 1727204606.05857: Added host managed-node2 to group all
40074 1727204606.05857: set ansible_host for managed-node2
40074 1727204606.05858: set ansible_ssh_extra_args for managed-node2
40074 1727204606.05860: set inventory_file for managed-node3
40074 1727204606.05862: set inventory_dir for managed-node3
40074 1727204606.05862: Added host managed-node3 to inventory
40074 1727204606.05863: Added host managed-node3 to group all
40074 1727204606.05864: set ansible_host for managed-node3
40074 1727204606.05864: set ansible_ssh_extra_args for managed-node3
40074 1727204606.05866: Reconcile groups and hosts in inventory.
40074 1727204606.05869: Group ungrouped now contains managed-node1
40074 1727204606.05871: Group ungrouped now contains managed-node2
40074 1727204606.05872: Group ungrouped now contains managed-node3
40074 1727204606.05941: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
40074 1727204606.06045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
40074 1727204606.06084: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
40074 1727204606.06109: Loaded config def from plugin (vars/host_group_vars)
40074 1727204606.06111: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
40074 1727204606.06116: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
40074 1727204606.06123: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
40074 1727204606.06161: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
40074 1727204606.06428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204606.06507: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
40074 1727204606.06539: Loaded config def from plugin (connection/local)
40074 1727204606.06542: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
40074 1727204606.07065: Loaded config def from plugin (connection/paramiko_ssh)
40074 1727204606.07068: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
40074 1727204606.07798: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
40074 1727204606.07831: Loaded config def from plugin (connection/psrp)
40074 1727204606.07834: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
40074 1727204606.08424: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
40074 1727204606.08459: Loaded config def from plugin (connection/ssh)
40074 1727204606.08462: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
40074 1727204606.10073: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
40074 1727204606.10107: Loaded config def from plugin (connection/winrm)
40074 1727204606.10110: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
40074 1727204606.10137: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
40074 1727204606.10193: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
40074 1727204606.10251: Loaded config def from plugin (shell/cmd)
40074 1727204606.10252: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
40074 1727204606.10274: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
40074 1727204606.10334: Loaded config def from plugin (shell/powershell)
40074 1727204606.10336: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
40074 1727204606.10380: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
40074 1727204606.10532: Loaded config def from plugin (shell/sh)
40074 1727204606.10534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
40074 1727204606.10561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
40074 1727204606.10665: Loaded config def from plugin (become/runas)
40074 1727204606.10667: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
40074 1727204606.10824: Loaded config def from plugin (become/su)
40074 1727204606.10827: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
40074 1727204606.10962: Loaded config def from plugin (become/sudo)
40074 1727204606.10964: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
40074 1727204606.10994: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml
40074 1727204606.11267: in VariableManager get_vars()
40074 1727204606.11286: done with get_vars()
40074 1727204606.11396: trying /usr/local/lib/python3.12/site-packages/ansible/modules
40074 1727204606.13743: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
40074 1727204606.13838: in VariableManager get_vars()
40074 1727204606.13842: done with get_vars()
40074 1727204606.13844: variable 'playbook_dir' from source: magic vars
40074 1727204606.13845: variable 'ansible_playbook_python' from source: magic vars
40074 1727204606.13845: variable 'ansible_config_file' from source: magic vars
40074 1727204606.13846: variable 'groups' from source: magic vars
40074 1727204606.13847: variable 'omit' from source: magic vars
40074 1727204606.13847: variable 'ansible_version' from source: magic vars
40074 1727204606.13848: variable 'ansible_check_mode' from source: magic vars
40074 1727204606.13848: variable 'ansible_diff_mode' from source: magic vars
40074 1727204606.13849: variable 'ansible_forks' from source: magic vars
40074 1727204606.13849: variable 'ansible_inventory_sources' from source: magic vars
40074 1727204606.13850: variable 'ansible_skip_tags' from source: magic vars
40074 1727204606.13851: variable 'ansible_limit' from source: magic vars
40074 1727204606.13851: variable 'ansible_run_tags' from source: magic vars
40074 1727204606.13852: variable 'ansible_verbosity' from source: magic vars
40074 1727204606.13880: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml
40074 1727204606.14559: in VariableManager get_vars()
40074 1727204606.14572: done with get_vars()
40074 1727204606.14603: in VariableManager get_vars()
40074 1727204606.14613: done with get_vars()
40074 1727204606.14641: in VariableManager get_vars()
40074 1727204606.14652: done with get_vars()
40074 1727204606.14742: in VariableManager get_vars()
40074 1727204606.14754: done with get_vars()
40074 1727204606.14787: in VariableManager get_vars()
40074 1727204606.14799: done with get_vars()
40074 1727204606.14837: in VariableManager get_vars()
40074 1727204606.14847: done with get_vars()
40074 1727204606.14899: in VariableManager get_vars()
40074 1727204606.14910: done with get_vars()
40074 1727204606.14914: variable 'omit' from source: magic vars
40074 1727204606.14928: variable 'omit' from source: magic vars
40074 1727204606.14955: in VariableManager get_vars()
40074 1727204606.14963: done with get_vars()
40074 1727204606.15005: in VariableManager get_vars()
40074 1727204606.15015: done with get_vars()
40074 1727204606.15044: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
40074 1727204606.15219: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
40074 1727204606.15326: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
40074 1727204606.15853: in VariableManager get_vars()
40074 1727204606.15870: done with get_vars()
40074 1727204606.16224: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
40074 1727204606.16340: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
40074 1727204606.18099: in VariableManager get_vars()
40074 1727204606.18113: done with get_vars()
40074 1727204606.18116: variable 'omit' from source: magic vars
40074 1727204606.18125: variable 'omit' from source: magic vars
40074 1727204606.18155: in VariableManager get_vars()
40074 1727204606.18167: done with get_vars()
40074 1727204606.18183: in VariableManager get_vars()
40074 1727204606.18196: done with get_vars()
40074 1727204606.18218: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
40074 1727204606.18308: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
40074 1727204606.18372: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
40074 1727204606.19856: in VariableManager get_vars()
40074 1727204606.19874: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
40074 1727204606.21647: in VariableManager get_vars()
40074 1727204606.21664: done with get_vars()
40074 1727204606.21693: in VariableManager get_vars()
40074 1727204606.21708: done with get_vars()
40074 1727204606.21815: in VariableManager get_vars()
40074 1727204606.21832: done with get_vars()
40074 1727204606.21865: in VariableManager get_vars()
40074 1727204606.21879: done with get_vars()
40074 1727204606.21910: in VariableManager get_vars()
40074 1727204606.21925: done with get_vars()
40074 1727204606.21958: in VariableManager get_vars()
40074 1727204606.21974: done with get_vars()
40074 1727204606.22025: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
40074 1727204606.22039: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
40074 1727204606.22237: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
40074 1727204606.22375: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
40074 1727204606.22377: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__)
40074 1727204606.22407: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
40074 1727204606.22428: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
40074 1727204606.22576: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
40074 1727204606.22634: Loaded config def from plugin (callback/default)
40074 1727204606.22636: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
40074 1727204606.23601: Loaded config def from plugin (callback/junit)
40074 1727204606.23605: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
40074 1727204606.23643: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
40074 1727204606.23697: Loaded config def from plugin (callback/minimal)
40074 1727204606.23699: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
40074 1727204606.23736: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
40074 1727204606.23895: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
40074 1727204606.23897: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
PLAYBOOK: tests_route_device_nm.yml ********************************************
2 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml
40074 1727204606.23918: in VariableManager get_vars()
40074 1727204606.23928: done with get_vars()
40074 1727204606.23936: in VariableManager get_vars()
40074 1727204606.23942: done with get_vars()
40074 1727204606.23946: variable 'omit' from source: magic vars
40074 1727204606.23974: in VariableManager get_vars()
40074 1727204606.23985: done with get_vars()
40074 1727204606.24002: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_route_device.yml' with nm as provider] *****
40074 1727204606.24441: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
40074 1727204606.24504: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
40074 1727204606.24533: getting the remaining hosts for this loop
40074 1727204606.24535: done getting the remaining hosts for this loop
40074 1727204606.24537: getting the next task for host managed-node2
40074 1727204606.24540: done getting next task for host managed-node2
40074 1727204606.24541: ^ task is: TASK: Gathering Facts
40074 1727204606.24543: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204606.24544: getting variables
40074 1727204606.24545: in VariableManager get_vars()
40074 1727204606.24552: Calling all_inventory to load vars for managed-node2
40074 1727204606.24554: Calling groups_inventory to load vars for managed-node2
40074 1727204606.24556: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204606.24566: Calling all_plugins_play to load vars for managed-node2
40074 1727204606.24576: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204606.24579: Calling groups_plugins_play to load vars for managed-node2
40074 1727204606.24611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204606.24654: done with get_vars()
40074 1727204606.24659: done getting variables
40074 1727204606.24714: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
Tuesday 24 September 2024 15:03:26 -0400 (0:00:00.009) 0:00:00.009 *****
40074 1727204606.24732: entering _queue_task() for managed-node2/gather_facts
40074 1727204606.24733: Creating lock for gather_facts
40074 1727204606.25042: worker is 1 (out of 1 available)
40074 1727204606.25055: exiting _queue_task() for managed-node2/gather_facts
40074 1727204606.25069: done queuing things up, now waiting for results queue to drain
40074 1727204606.25071: waiting for pending results...
40074 1727204606.25217: running TaskExecutor() for managed-node2/TASK: Gathering Facts
40074 1727204606.25282: in run() - task 12b410aa-8751-9fd7-2501-0000000000bf
40074 1727204606.25297: variable 'ansible_search_path' from source: unknown
40074 1727204606.25329: calling self._execute()
40074 1727204606.25382: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204606.25388: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204606.25398: variable 'omit' from source: magic vars
40074 1727204606.25482: variable 'omit' from source: magic vars
40074 1727204606.25506: variable 'omit' from source: magic vars
40074 1727204606.25539: variable 'omit' from source: magic vars
40074 1727204606.25579: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204606.25612: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204606.25634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204606.25650: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204606.25661: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204606.25691: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204606.25694: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204606.25699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204606.25790: Set connection var ansible_pipelining to False
40074 1727204606.25797: Set connection var ansible_shell_executable to /bin/sh
40074 1727204606.25801: Set connection var ansible_shell_type to sh
40074 1727204606.25804: Set connection var ansible_connection to ssh
40074 1727204606.25811: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204606.25817: Set connection var ansible_timeout to 10
40074 1727204606.25840: variable 'ansible_shell_executable' from source: unknown
40074 1727204606.25844: variable 'ansible_connection' from source: unknown
40074 1727204606.25847: variable 'ansible_module_compression' from source: unknown
40074 1727204606.25853: variable 'ansible_shell_type' from source: unknown
40074 1727204606.25857: variable 'ansible_shell_executable' from source: unknown
40074 1727204606.25860: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204606.25866: variable 'ansible_pipelining' from source: unknown
40074 1727204606.25869: variable 'ansible_timeout' from source: unknown
40074 1727204606.25874: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204606.26027: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204606.26037: variable 'omit' from source: magic vars
40074 1727204606.26043: starting attempt loop
40074 1727204606.26046: running the handler
40074 1727204606.26061: variable 'ansible_facts' from source: unknown
40074 1727204606.26079: _low_level_execute_command(): starting
40074 1727204606.26086: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
40074 1727204606.26647: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204606.26652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204606.26655: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204606.26657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204606.26702: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204606.26716: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204606.26778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204606.28554: stdout chunk (state=3): >>>/root <<<
40074 1727204606.28668: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204606.28722: stderr chunk (state=3): >>><<<
40074 1727204606.28725: stdout chunk (state=3): >>><<<
40074 1727204606.28749: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204606.28760: _low_level_execute_command(): starting
40074 1727204606.28767: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574 `" && echo ansible-tmp-1727204606.2874713-40077-206164311673574="` echo /root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574 `" ) && sleep 0'
40074 1727204606.29241: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204606.29244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
40074 1727204606.29249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<<
40074 1727204606.29251: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<<
40074 1727204606.29259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204606.29308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204606.29313: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204606.29353: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204606.31437: stdout chunk (state=3): >>>ansible-tmp-1727204606.2874713-40077-206164311673574=/root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574 <<<
40074 1727204606.31553: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204606.31607: stderr chunk (state=3): >>><<<
40074 1727204606.31610: stdout chunk (state=3): >>><<<
40074 1727204606.31629: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204606.2874713-40077-206164311673574=/root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204606.31660: variable 'ansible_module_compression' from source: unknown
40074 1727204606.31707: ANSIBALLZ: Using generic lock for ansible.legacy.setup
40074 1727204606.31710: ANSIBALLZ: Acquiring lock
40074 1727204606.31713: ANSIBALLZ: Lock acquired: 139809964199616
40074 1727204606.31715: ANSIBALLZ: Creating module
40074 1727204606.57401: ANSIBALLZ: Writing module into payload
40074 1727204606.57505: ANSIBALLZ: Writing module
40074 1727204606.57544: ANSIBALLZ: Renaming module
40074 1727204606.57594: ANSIBALLZ: Done creating module
40074 1727204606.57598: variable 'ansible_facts' from source: unknown
40074 1727204606.57601: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204606.57604: _low_level_execute_command(): starting
40074 1727204606.57606: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0'
40074 1727204606.58325: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
40074 1727204606.58336: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204606.58408: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204606.58496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204606.58500: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
40074 1727204606.58502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204606.58602: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204606.60427: stdout chunk (state=3): >>>PLATFORM <<<
40074 1727204606.60522: stdout chunk (state=3): >>>Linux <<<
40074 1727204606.60544: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<<
40074 1727204606.60763: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204606.60784: stderr chunk (state=3): >>><<<
40074 1727204606.60798: stdout chunk (state=3): >>><<<
40074 1727204606.60822: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1:
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204606.60844 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 40074 1727204606.60916: _low_level_execute_command(): starting 40074 1727204606.60928: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 40074 1727204606.61226: Sending initial data 40074 1727204606.61232: Sent initial data (1181 bytes) 40074 1727204606.61622: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204606.61640: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204606.61701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204606.61715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204606.61733: stderr chunk (state=3): >>>debug2: match found <<< 40074 1727204606.61785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204606.61850: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204606.61876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204606.61911: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204606.61991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204606.65773: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 40074 1727204606.66402: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204606.66406: stdout chunk (state=3): >>><<< 40074 1727204606.66409: stderr chunk (state=3): >>><<< 40074 1727204606.66412: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty 
Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204606.66470: variable 'ansible_facts' from source: unknown 40074 1727204606.66481: variable 'ansible_facts' from source: unknown 40074 1727204606.66509: variable 
'ansible_module_compression' from source: unknown 40074 1727204606.66566: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 40074 1727204606.66607: variable 'ansible_facts' from source: unknown 40074 1727204606.66801: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/AnsiballZ_setup.py 40074 1727204606.67106: Sending initial data 40074 1727204606.67110: Sent initial data (154 bytes) 40074 1727204606.67750: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204606.67775: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204606.67796: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204606.67848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204606.67935: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204606.67960: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204606.67994: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204606.68071: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 40074 1727204606.69786: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204606.69871: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204606.69901: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpxwoh2tc5 /root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/AnsiballZ_setup.py <<< 40074 1727204606.69904: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/AnsiballZ_setup.py" <<< 40074 1727204606.69950: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpxwoh2tc5" to remote "/root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/AnsiballZ_setup.py" <<< 40074 1727204606.72508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204606.72528: stderr chunk (state=3): >>><<< 40074 1727204606.72541: stdout chunk (state=3): >>><<< 40074 
1727204606.72607: done transferring module to remote 40074 1727204606.72610: _low_level_execute_command(): starting 40074 1727204606.72617: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/ /root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/AnsiballZ_setup.py && sleep 0' 40074 1727204606.73310: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204606.73328: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204606.73367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204606.73494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204606.73515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204606.73604: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204606.75771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204606.75775: stdout chunk (state=3): >>><<< 40074 1727204606.75778: stderr chunk (state=3): 
>>><<< 40074 1727204606.75780: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204606.75783: _low_level_execute_command(): starting 40074 1727204606.75785: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/AnsiballZ_setup.py && sleep 0' 40074 1727204606.76424: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204606.76445: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204606.76576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204606.76603: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204606.76686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204606.78923: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 40074 1727204606.78952: stdout chunk (state=3): >>>import _imp # builtin <<< 40074 1727204606.79003: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 40074 1727204606.79066: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 40074 1727204606.79134: stdout chunk (state=3): >>>import 'posix' # <<< 40074 1727204606.79157: stdout chunk (state=3): >>>import '_frozen_importlib_external' # # installing zipimport hook <<< 40074 1727204606.79184: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 40074 1727204606.79253: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.79277: stdout chunk (state=3): >>>import '_codecs' # <<< 40074 1727204606.79288: stdout chunk (state=3): >>>import 'codecs' # <<< 40074 1727204606.79323: stdout 
chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 40074 1727204606.79371: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 40074 1727204606.79411: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b60c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b5dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b60ea20> <<< 40074 1727204606.79443: stdout chunk (state=3): >>>import '_signal' # <<< 40074 1727204606.79468: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 40074 1727204606.79507: stdout chunk (state=3): >>>import 'io' # <<< 40074 1727204606.79520: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 40074 1727204606.79604: stdout chunk (state=3): >>>import '_collections_abc' # <<< 40074 1727204606.79651: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 40074 1727204606.79699: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 40074 1727204606.79755: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 40074 1727204606.79765: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 40074 1727204606.79795: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4210a0> <<< 40074 1727204606.79860: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 40074 1727204606.79898: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b421fd0> <<< 40074 1727204606.79928: stdout chunk (state=3): >>>import 'site' # <<< 40074 1727204606.79945: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 40074 1727204606.80356: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 40074 1727204606.80374: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 40074 1727204606.80405: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 40074 1727204606.80462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 40074 1727204606.80498: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 40074 1727204606.80552: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45fda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py <<< 40074 1727204606.80579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 40074 1727204606.80603: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45ffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 40074 1727204606.80643: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 40074 1727204606.80648: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 40074 1727204606.80729: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.80771: stdout chunk (state=3): >>>import 'itertools' # <<< 40074 1727204606.80780: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4977a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 40074 1727204606.80808: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b497e30> import '_collections' # <<< 40074 1727204606.80858: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b477a40> <<< 40074 1727204606.80878: stdout chunk (state=3): >>>import '_functools' # <<< 40074 1727204606.80904: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b475190> <<< 40074 1727204606.80985: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45cf50> <<< 40074 1727204606.81036: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 40074 1727204606.81059: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 40074 1727204606.81092: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 40074 1727204606.81118: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 40074 1727204606.81160: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4bb650> <<< 40074 1727204606.81200: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ba270> <<< 40074 1727204606.81216: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b476180> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4b8b30> <<< 40074 1727204606.81279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 40074 1727204606.81310: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ec650> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45c1d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 40074 1727204606.81338: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.81370: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f5e7b4ecb00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ec9b0> <<< 40074 1727204606.81395: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b4ecd70> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45ad20> <<< 40074 1727204606.81445: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 40074 1727204606.81448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.81537: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 40074 1727204606.81541: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ed430> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ed100> import 'importlib.machinery' # <<< 40074 1727204606.81548: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 40074 1727204606.81578: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ee330> import 'importlib.util' # import 'runpy' # <<< 40074 1727204606.81611: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py <<< 40074 1727204606.81649: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 40074 1727204606.81707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b508560> <<< 40074 1727204606.81712: stdout chunk (state=3): >>>import 'errno' # <<< 40074 1727204606.81746: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b509ca0> <<< 40074 1727204606.81768: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 40074 1727204606.81799: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b50aba0> <<< 40074 1727204606.81849: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.81882: stdout chunk (state=3): >>># extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b50b200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5e7b50a0f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 40074 1727204606.81928: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.81941: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b50bc80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b50b3b0> <<< 40074 1727204606.82002: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ee390> <<< 40074 1727204606.82037: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 40074 1727204606.82049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 40074 1727204606.82080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 40074 1727204606.82156: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b25fbf0> <<< 40074 1727204606.82159: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 40074 
1727204606.82221: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b28c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b28c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.82269: stdout chunk (state=3): >>># extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b28c6e0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b28c8c0> <<< 40074 1727204606.82273: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b25dd90> <<< 40074 1727204606.82286: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 40074 1727204606.82394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 40074 1727204606.82428: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b28dfd0> <<< 40074 1727204606.82469: stdout chunk (state=3): 
>>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b28cc50> <<< 40074 1727204606.82485: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4eea80> <<< 40074 1727204606.82511: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 40074 1727204606.82579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.82586: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 40074 1727204606.82620: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 40074 1727204606.82660: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b2ba300> <<< 40074 1727204606.82729: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 40074 1727204606.82739: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.82770: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 40074 1727204606.82856: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b2ce480> <<< 40074 1727204606.82874: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 40074 1727204606.82952: stdout 
chunk (state=3): >>>import 'ntpath' # <<< 40074 1727204606.82974: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b30b230> <<< 40074 1727204606.83010: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 40074 1727204606.83035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 40074 1727204606.83049: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 40074 1727204606.83095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 40074 1727204606.83182: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b32d9d0> <<< 40074 1727204606.83259: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b30b350> <<< 40074 1727204606.83322: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b2cf110> <<< 40074 1727204606.83364: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' <<< 40074 1727204606.83377: stdout chunk (state=3): >>>import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b11c380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b2cd4c0> import 'zipfile' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b28eea0> <<< 40074 1727204606.83535: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 40074 1727204606.83567: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5e7b11c620> <<< 40074 1727204606.83733: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_rkxe44u8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 40074 1727204606.83889: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.83923: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 40074 1727204606.83965: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 40074 1727204606.84064: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 40074 1727204606.84088: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1820f0> <<< 40074 1727204606.84108: stdout chunk (state=3): >>>import '_typing' # <<< 40074 1727204606.84315: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b158fe0> <<< 40074 1727204606.84321: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b158170> # zipimport: zlib available <<< 40074 1727204606.84367: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib 
available <<< 40074 1727204606.84387: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # <<< 40074 1727204606.84413: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.85966: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.87281: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 40074 1727204606.87334: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b15bf50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.87358: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 40074 1727204606.87393: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 40074 1727204606.87408: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b1b59d0> <<< 40074 1727204606.87447: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1b5760> <<< 40074 1727204606.87490: stdout chunk (state=3): >>>import 
'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1b50a0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 40074 1727204606.87514: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 40074 1727204606.87556: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1b5af0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b182b10> <<< 40074 1727204606.87597: stdout chunk (state=3): >>>import 'atexit' # <<< 40074 1727204606.87638: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b1b6780> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b1b69c0> <<< 40074 1727204606.87641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 40074 1727204606.87695: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 40074 1727204606.87699: stdout chunk (state=3): >>>import '_locale' # <<< 40074 1727204606.87758: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1b6ea0> <<< 40074 1727204606.87785: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches 
/usr/lib64/python3.12/platform.py <<< 40074 1727204606.87818: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 40074 1727204606.87835: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b014bf0> <<< 40074 1727204606.87882: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b016810> <<< 40074 1727204606.87908: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 40074 1727204606.87952: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b0171a0> <<< 40074 1727204606.87970: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 40074 1727204606.88017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b018380> <<< 40074 1727204606.88040: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 40074 1727204606.88071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 40074 1727204606.88102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 40074 
1727204606.88146: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01ae10> <<< 40074 1727204606.88212: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b01b170> <<< 40074 1727204606.88226: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b019100> <<< 40074 1727204606.88264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 40074 1727204606.88317: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 40074 1727204606.88322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 40074 1727204606.88368: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 40074 1727204606.88384: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01ede0> import '_tokenize' # <<< 40074 1727204606.88473: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01d8b0> import 'linecache' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01d610> <<< 40074 1727204606.88495: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 40074 1727204606.88560: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01fda0> <<< 40074 1727204606.88615: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b0195e0> <<< 40074 1727204606.88652: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b062f90> <<< 40074 1727204606.88680: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b0631a0> <<< 40074 1727204606.88714: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 40074 1727204606.88742: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 40074 1727204606.88779: stdout chunk (state=3): >>># extension module '_datetime' 
loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b06cc80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b06ca40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 40074 1727204606.88914: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 40074 1727204606.88970: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b06f230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b06d370> <<< 40074 1727204606.89002: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 40074 1727204606.89062: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.89094: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 40074 1727204606.89112: stdout chunk (state=3): >>>import '_string' # <<< 40074 1727204606.89141: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b072a50> <<< 40074 1727204606.89301: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b06f3e0> 
<<< 40074 1727204606.89380: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b073830> <<< 40074 1727204606.89418: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b073860> <<< 40074 1727204606.89472: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.89497: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b073e00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b063380> <<< 40074 1727204606.89540: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 40074 1727204606.89555: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 40074 1727204606.89598: stdout 
chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.89627: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b076ab0> <<< 40074 1727204606.89835: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b077e90> <<< 40074 1727204606.89877: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b075250> <<< 40074 1727204606.89899: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b076600> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b074e00> # zipimport: zlib available <<< 40074 1727204606.89934: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 40074 1727204606.90045: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.90176: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.90180: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 40074 1727204606.90213: stdout chunk (state=3): >>># zipimport: zlib available # 
zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 40074 1727204606.90353: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.90496: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.91188: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.91892: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 40074 1727204606.91902: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 40074 1727204606.91946: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 40074 1727204606.91949: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.92005: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7aefc170> <<< 40074 1727204606.92116: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py <<< 40074 1727204606.92146: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7aefd670> <<< 40074 1727204606.92164: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b07a9c0> <<< 40074 1727204606.92214: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # 
<<< 40074 1727204606.92257: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.92260: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.92288: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 40074 1727204606.92451: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.92662: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 40074 1727204606.92666: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7aefd6a0> <<< 40074 1727204606.92681: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.93244: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.93788: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.93873: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.93975: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 40074 1727204606.93979: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.94015: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.94061: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 40074 1727204606.94160: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.94303: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 40074 1727204606.94307: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.94318: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 40074 1727204606.94358: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.94408: 
stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 40074 1727204606.94420: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.94703: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.94979: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 40074 1727204606.95063: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 40074 1727204606.95075: stdout chunk (state=3): >>>import '_ast' # <<< 40074 1727204606.95167: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7aeff890> <<< 40074 1727204606.95184: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.95259: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.95368: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 40074 1727204606.95403: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 40074 1727204606.95479: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.95614: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7af05cd0> <<< 40074 1727204606.95688: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7af065d0> <<< 40074 1727204606.95694: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7aefe630> <<< 40074 1727204606.95714: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.95748: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.95798: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 40074 1727204606.95802: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.95846: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.95886: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.95954: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.96025: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 40074 1727204606.96070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.96168: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7af05370> <<< 40074 1727204606.96211: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af06840> <<< 40074 1727204606.96250: 
stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 40074 1727204606.96326: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.96396: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.96437: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.96501: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.96532: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 40074 1727204606.96553: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 40074 1727204606.96567: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 40074 1727204606.96634: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 40074 1727204606.96668: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 40074 1727204606.96671: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 40074 1727204606.96727: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af9eb40> <<< 40074 1727204606.96777: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af10890> <<< 40074 1727204606.96873: stdout chunk (state=3): >>>import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af0fc80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af0fb00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 40074 1727204606.96912: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.96916: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.96955: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 40074 1727204606.97029: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 40074 1727204606.97057: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 40074 1727204606.97074: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97127: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97211: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97228: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97248: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97292: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97335: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97374: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97429: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 40074 1727204606.97442: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97515: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97608: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97624: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97670: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 40074 
1727204606.97682: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.97885: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.98080: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.98125: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.98191: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204606.98243: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 40074 1727204606.98248: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 40074 1727204606.98293: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 40074 1727204606.98319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 40074 1727204606.98323: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7afa5220> <<< 40074 1727204606.98375: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 40074 1727204606.98379: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 40074 1727204606.98444: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 40074 1727204606.98449: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 40074 1727204606.98483: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5042c0> <<< 40074 1727204606.98525: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.98529: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a504890> <<< 40074 1727204606.98600: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af15460> <<< 40074 1727204606.98623: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af148c0> <<< 40074 1727204606.98657: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7afa7440> <<< 40074 1727204606.98676: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7afa6ab0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 40074 1727204606.98758: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 40074 1727204606.98781: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from 
'/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 40074 1727204606.98833: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 40074 1727204606.98856: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a507530> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a506e10> <<< 40074 1727204606.98919: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204606.98923: stdout chunk (state=3): >>>import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a506fc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a506240> <<< 40074 1727204606.98950: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 40074 1727204606.99071: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 40074 1727204606.99110: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a507680> <<< 40074 1727204606.99113: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 40074 1727204606.99139: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 40074 1727204606.99179: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a5721b0> <<< 40074 1727204606.99208: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5701d0> <<< 40074 1727204606.99244: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a503f80> import 'ansible.module_utils.facts.timeout' # <<< 40074 1727204606.99284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 40074 1727204606.99318: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 40074 1727204606.99376: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.99455: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 40074 1727204606.99459: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.99525: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.99594: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 40074 1727204606.99597: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.99628: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 40074 1727204606.99655: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 
1727204606.99700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available <<< 40074 1727204606.99754: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.99815: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available <<< 40074 1727204606.99870: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.99929: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 40074 1727204606.99935: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204606.99991: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.00057: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.00119: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.00203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # <<< 40074 1727204607.00206: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.00761: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01278: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 40074 1727204607.01282: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01341: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01399: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01441: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01483: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 40074 1727204607.01541: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01548: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01573: stdout chunk 
(state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 40074 1727204607.01587: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01635: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 40074 1727204607.01711: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01748: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01793: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 40074 1727204607.01805: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01816: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.01861: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available <<< 40074 1727204607.01956: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.02055: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 40074 1727204607.02070: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 40074 1727204607.02110: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5736b0> <<< 40074 1727204607.02114: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 40074 1727204607.02143: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 40074 1727204607.02279: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a572cc0> <<< 40074 1727204607.02293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib 
available <<< 40074 1727204607.02363: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.02438: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 40074 1727204607.02454: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.02549: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.02651: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 40074 1727204607.02664: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.02734: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.02819: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 40074 1727204607.02831: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.02871: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.02922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 40074 1727204607.02978: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 40074 1727204607.03059: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204607.03127: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a59e330> <<< 40074 1727204607.03345: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a58b200> import 'ansible.module_utils.facts.system.python' # <<< 40074 1727204607.03357: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.03421: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 
1727204607.03481: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 40074 1727204607.03502: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.03584: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.03678: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.03806: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.03971: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 40074 1727204607.03996: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04033: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04077: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 40074 1727204607.04098: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04134: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04187: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 40074 1727204607.04227: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204607.04248: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a5b9f10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a59c230> <<< 40074 1727204607.04283: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available <<< 40074 1727204607.04319: stdout chunk (state=3): >>># zipimport: 
zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available <<< 40074 1727204607.04361: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04409: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 40074 1727204607.04428: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04597: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04777: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 40074 1727204607.04781: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04888: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.04999: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.05051: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.05095: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 40074 1727204607.05113: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 40074 1727204607.05136: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.05162: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.05323: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.05488: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 40074 1727204607.05510: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.05637: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.05782: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 40074 1727204607.05785: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.05827: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 
1727204607.05862: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.06507: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.07092: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 40074 1727204607.07111: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.07222: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.07344: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 40074 1727204607.07365: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.07461: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.07576: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 40074 1727204607.07599: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.07754: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.07936: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 40074 1727204607.07967: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 40074 1727204607.07986: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08028: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08090: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 40074 1727204607.08094: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08193: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08307: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08534: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08765: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.generic_bsd' # <<< 40074 1727204607.08787: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 40074 1727204607.08826: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08875: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 40074 1727204607.08878: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08920: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.08945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 40074 1727204607.08948: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.09016: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.09098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 40074 1727204607.09117: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.09140: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.09176: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available <<< 40074 1727204607.09236: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.09305: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 40074 1727204607.09371: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.09435: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 40074 1727204607.09456: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.09752: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10067: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 40074 1727204607.10070: stdout chunk (state=3): >>># zipimport: zlib available <<< 
40074 1727204607.10127: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10209: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 40074 1727204607.10213: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10243: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 40074 1727204607.10298: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10338: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10367: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 40074 1727204607.10382: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10411: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10464: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 40074 1727204607.10468: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10556: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10654: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 40074 1727204607.10659: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10693: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 40074 1727204607.10742: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10811: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 40074 1727204607.10814: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10847: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10850: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.10906: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 40074 1727204607.10956: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11040: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11131: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # <<< 40074 1727204607.11152: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 40074 1727204607.11204: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11272: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 40074 1727204607.11276: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11494: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11726: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 40074 1727204607.11730: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11780: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11843: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 40074 1727204607.11846: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11894: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.11950: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 40074 1727204607.11953: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.12050: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.12135: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 40074 1727204607.12158: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.12252: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 40074 1727204607.12353: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 40074 1727204607.12448: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204607.12832: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 40074 1727204607.12838: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 40074 1727204607.12854: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 40074 1727204607.12917: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a34b5c0> <<< 40074 1727204607.12921: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a349e80> <<< 40074 1727204607.12970: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a34b710> <<< 40074 1727204607.31539: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 40074 1727204607.31596: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5e7a391cd0> <<< 40074 1727204607.31600: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 40074 1727204607.31635: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 40074 1727204607.31639: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a3939b0> <<< 40074 1727204607.31694: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 40074 1727204607.31721: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204607.31755: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 40074 1727204607.31792: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5a5190> <<< 40074 1727204607.31811: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5a4cb0> <<< 40074 1727204607.32075: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 40074 1727204607.52598: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", 
"ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.4453125, "5m": 0.60107421875, "15m": 0.46630859375}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, 
"ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.f<<< 40074 1727204607.52655: stdout chunk (state=3): >>>c39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", 
"tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": 
"off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumm<<< 40074 1727204607.52686: stdout chunk (state=3): >>>ing": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off 
[fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "ee:fa:4b:42:85:80", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", 
"netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU 
E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2796, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 921, "free": 2796}, "nocache": {"free": 3457, "used": 260}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": 
{"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1111, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251124867072, "block_size": 4096, "block_total": 64479564, "block_available": 61309782, "block_used": 3169782, "inode_total": 16384000, "inode_available": 16302100, "inode_used": 81900, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_local": {}, "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "27", "epoch": "1727204607", "epoch_int": "1727204607", "date": "2024-09-24", "time": "15:03:27", "iso8601_micro": "2024-09-24T19:03:27.522402Z", "iso8601": "2024-09-24T19:03:27Z", "iso8601_basic": "20240924T150327522402", "iso8601_basic_short": "20240924T150327", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 40074 1727204607.53273: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 40074 1727204607.53309: stdout chunk (state=3): >>># clear 
sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 40074 1727204607.53340: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat <<< 40074 1727204607.53381: stdout chunk (state=3): >>># cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing 
re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler <<< 40074 1727204607.53384: stdout chunk (state=3): >>># cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref <<< 40074 1727204607.53456: stdout chunk (state=3): >>># cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy 
__future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six <<< 40074 1727204607.53462: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy 
ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy <<< 40074 1727204607.53523: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux <<< 40074 1727204607.53528: stdout chunk (state=3): >>># cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy 
ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 40074 1727204607.53594: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter <<< 40074 1727204607.53641: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] 
removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob <<< 40074 1727204607.53646: stdout chunk (state=3): >>># cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl <<< 40074 1727204607.53672: stdout chunk (state=3): >>># destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing 
ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing 
ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors <<< 40074 1727204607.53724: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb <<< 40074 1727204607.53749: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy 
ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd <<< 40074 1727204607.53795: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly <<< 40074 1727204607.53799: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing 
stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 40074 1727204607.54176: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 40074 1727204607.54180: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 40074 1727204607.54204: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 40074 1727204607.54228: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 40074 1727204607.54284: stdout chunk (state=3): >>># destroy ntpath <<< 40074 1727204607.54291: stdout chunk (state=3): >>># destroy importlib <<< 40074 1727204607.54343: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal <<< 40074 1727204607.54347: stdout chunk (state=3): >>># destroy _posixsubprocess <<< 40074 1727204607.54365: stdout chunk (state=3): >>># destroy syslog # destroy uuid <<< 40074 1727204607.54412: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 40074 1727204607.54427: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 40074 1727204607.54462: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector<<< 40074 1727204607.54486: stdout chunk (state=3): >>> # destroy multiprocessing # destroy multiprocessing.queues 
# destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle <<< 40074 1727204607.54527: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue <<< 40074 1727204607.54552: stdout chunk (state=3): >>># destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 40074 1727204607.54578: stdout chunk (state=3): >>># destroy _ssl <<< 40074 1727204607.54612: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios <<< 40074 1727204607.54647: stdout chunk (state=3): >>># destroy json <<< 40074 1727204607.54677: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array <<< 40074 1727204607.54688: stdout chunk (state=3): >>># destroy multiprocessing.dummy.connection <<< 40074 1727204607.54727: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep <<< 40074 1727204607.54772: stdout chunk (state=3): >>># cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 40074 1727204607.54802: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # 
cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect <<< 40074 1727204607.54845: stdout chunk (state=3): >>># cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser <<< 40074 1727204607.54850: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc<<< 40074 1727204607.54899: stdout chunk (state=3): >>> # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io <<< 40074 1727204607.54911: stdout chunk (state=3): >>># destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping 
_weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 40074 1727204607.55369: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 40074 1727204607.55442: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re <<< 40074 1727204607.55446: stdout chunk (state=3): >>># destroy itertools # destroy _abc <<< 40074 1727204607.55448: stdout chunk (state=3): >>># destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 40074 
1727204607.55919: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204607.55932: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 40074 1727204607.56163: stderr chunk (state=3): >>><<< 40074 1727204607.56167: stdout chunk (state=3): >>><<< 40074 1727204607.56462: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b60c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b5dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b60ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: 
'/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b421fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45fda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45ffe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4977a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b497e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b477a40> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b475190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45cf50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4bb650> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ba270> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b476180> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4b8b30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ec650> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45c1d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b4ecb00> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ec9b0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b4ecd70> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b45ad20> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ed430> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ed100> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ee330> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b508560> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b509ca0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b50aba0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b50b200> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b50a0f0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b50bc80> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b50b3b0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4ee390> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b25fbf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b28c6b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b28c410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b28c6e0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed 
from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b28c8c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b25dd90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b28dfd0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b28cc50> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b4eea80> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b2ba300> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b2ce480> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py 
# code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b30b230> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b32d9d0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b30b350> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b2cf110> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b11c380> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b2cd4c0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b28eea0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5e7b11c620> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_rkxe44u8/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # 
code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1820f0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b158fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b158170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b15bf50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b1b59d0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1b5760> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1b50a0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1b5af0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b182b10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b1b6780> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b1b69c0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b1b6ea0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b014bf0> # extension module 'select' loaded from 
'/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b016810> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b0171a0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b018380> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01ae10> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b01b170> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b019100> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc 
matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01ede0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01d8b0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01d610> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b01fda0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b0195e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b062f90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b0631a0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b06cc80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b06ca40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b06f230> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b06d370> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b072a50> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b06f3e0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # 
extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b073830> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b073860> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b073e00> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b063380> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b076ab0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from 
'/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b077e90> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b075250> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7b076600> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b074e00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7aefc170> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches 
/usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7aefd670> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7b07a9c0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7aefd6a0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7aeff890> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches 
/usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7af05cd0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7af065d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7aefe630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7af05370> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af06840> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available 
# /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af9eb40> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af10890> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af0fc80> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af0fb00> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib 
available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7afa5220> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5042c0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a504890> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5e7af15460> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7af148c0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7afa7440> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7afa6ab0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a507530> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a506e10> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a506fc0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a506240> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a507680> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a5721b0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5701d0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a503f80> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5736b0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a572cc0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from 
'/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a59e330> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a58b200> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a5b9f10> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a59c230> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 
'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5e7a34b5c0> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a349e80> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a34b710> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a391cd0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a3939b0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5a5190> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5e7a5a4cb0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.4453125, "5m": 0.60107421875, "15m": 0.46630859375}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": 
"AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, 
"ansible_os_family": "RedHat", "ansible_lsb": {}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_pkg_mgr": "dnf", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, 
"ansible_fibre_channel_wwn": [], "ansible_interfaces": ["lo", "eth0", "rpltstbr"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", 
"loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", 
"tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "ee:fa:4b:42:85:80", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", 
"netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2796, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 921, "free": 2796}, "nocache": {"free": 3457, "used": 260}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], 
"uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1111, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251124867072, "block_size": 4096, "block_total": 64479564, "block_available": 61309782, "block_used": 3169782, "inode_total": 16384000, "inode_available": 16302100, "inode_used": 81900, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_local": {}, "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "27", "epoch": "1727204607", "epoch_int": "1727204607", "date": "2024-09-24", "time": "15:03:27", 
"iso8601_micro": "2024-09-24T19:03:27.522402Z", "iso8601": "2024-09-24T19:03:27Z", "iso8601_basic": "20240924T150327522402", "iso8601_basic_short": "20240924T150327", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # 
cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # 
cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # 
cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy 
ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing 
ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing 
ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing 
ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy 
ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy 
_compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] 
wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # 
cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
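The `auto-mux: Trying existing master` and `mux_client_request_session` lines above come from OpenSSH connection multiplexing, which Ansible turns on through its default `ssh_args`. A minimal `ansible.cfg` sketch that produces this behavior (these are the stock option values, not settings read from this run):

```ini
[ssh_connection]
# Reuse one persistent master SSH connection per host instead of
# opening a fresh session for every task; this is what the
# "Trying existing master" debug lines reflect.
ssh_args = -C -o ControlMaster=auto -o ControlPersist=60s
```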
[WARNING]: Module invocation had junk after the JSON data: (interpreter-shutdown trace omitted; identical to the module stderr shown above) [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
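The interpreter-discovery warning above can be avoided by pinning the interpreter explicitly in inventory, so a later Python installation on the managed node cannot change which interpreter modules run under. A minimal sketch in YAML inventory form (the host name mirrors the log; the exact placement in your inventory layout is an assumption):

```yaml
all:
  hosts:
    managed-node2:
      # Pin the interpreter that discovery found, so future runs do
      # not depend on interpreter discovery at all.
      ansible_python_interpreter: /usr/bin/python3.12
```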
40074 1727204607.60262: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204607.60295: _low_level_execute_command(): starting 40074 1727204607.60312: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204606.2874713-40077-206164311673574/ > /dev/null 2>&1 && sleep 0' 40074 1727204607.61083: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204607.61161: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204607.61184: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204607.61302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204607.61508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204607.64280: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204607.64320: stdout chunk (state=3): >>><<< 40074 1727204607.64324: stderr chunk (state=3): >>><<< 40074 1727204607.64346: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204607.64434: handler run complete 40074 1727204607.64574: variable 'ansible_facts' from source: unknown 40074 1727204607.64682: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204607.65007: variable 'ansible_facts' from source: unknown 40074 1727204607.65086: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204607.65221: attempt loop complete, returning result 40074 1727204607.65225: _execute() done 40074 1727204607.65228: dumping result to json 40074 1727204607.65259: done dumping result, returning 40074 1727204607.65267: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-9fd7-2501-0000000000bf] 40074 1727204607.65271: sending task result for task 12b410aa-8751-9fd7-2501-0000000000bf 40074 1727204607.65985: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000bf ok: [managed-node2] 40074 1727204607.65996: WORKER PROCESS EXITING 40074 1727204607.66115: no more pending results, returning what we have 40074 1727204607.66121: results queue empty 40074 1727204607.66122: checking for any_errors_fatal 40074 1727204607.66123: done checking for any_errors_fatal 40074 1727204607.66123: checking for max_fail_percentage 40074 1727204607.66125: done checking for max_fail_percentage 40074 1727204607.66125: checking to see if all hosts have failed and the running result is not ok 40074 1727204607.66126: done checking to see if all hosts have failed 40074 1727204607.66128: getting the remaining hosts for this loop 40074 1727204607.66129: done getting the remaining hosts for this loop 40074 1727204607.66133: getting the next task for host managed-node2 40074 1727204607.66139: done getting next task for host managed-node2 40074 1727204607.66140: ^ task is: TASK: meta (flush_handlers) 40074 1727204607.66142: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 40074 1727204607.66145: getting variables 40074 1727204607.66146: in VariableManager get_vars() 40074 1727204607.66162: Calling all_inventory to load vars for managed-node2 40074 1727204607.66164: Calling groups_inventory to load vars for managed-node2 40074 1727204607.66167: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204607.66174: Calling all_plugins_play to load vars for managed-node2 40074 1727204607.66177: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204607.66179: Calling groups_plugins_play to load vars for managed-node2 40074 1727204607.66345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204607.66606: done with get_vars() 40074 1727204607.66618: done getting variables 40074 1727204607.66698: in VariableManager get_vars() 40074 1727204607.66708: Calling all_inventory to load vars for managed-node2 40074 1727204607.66711: Calling groups_inventory to load vars for managed-node2 40074 1727204607.66714: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204607.66720: Calling all_plugins_play to load vars for managed-node2 40074 1727204607.66723: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204607.66726: Calling groups_plugins_play to load vars for managed-node2 40074 1727204607.66955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204607.67859: done with get_vars() 40074 1727204607.67876: done queuing things up, now waiting for results queue to drain 40074 1727204607.67879: results queue empty 40074 1727204607.67880: checking for any_errors_fatal 40074 1727204607.67883: done checking for any_errors_fatal 40074 1727204607.67884: checking for max_fail_percentage 40074 1727204607.67885: done checking for max_fail_percentage 40074 
1727204607.67886: checking to see if all hosts have failed and the running result is not ok 40074 1727204607.67887: done checking to see if all hosts have failed 40074 1727204607.67888: getting the remaining hosts for this loop 40074 1727204607.67896: done getting the remaining hosts for this loop 40074 1727204607.67899: getting the next task for host managed-node2 40074 1727204607.67906: done getting next task for host managed-node2 40074 1727204607.67908: ^ task is: TASK: Include the task 'el_repo_setup.yml' 40074 1727204607.67910: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204607.67913: getting variables 40074 1727204607.67914: in VariableManager get_vars() 40074 1727204607.67924: Calling all_inventory to load vars for managed-node2 40074 1727204607.67926: Calling groups_inventory to load vars for managed-node2 40074 1727204607.67930: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204607.67935: Calling all_plugins_play to load vars for managed-node2 40074 1727204607.67939: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204607.67942: Calling groups_plugins_play to load vars for managed-node2 40074 1727204607.68679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204607.69361: done with get_vars() 40074 1727204607.69371: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:11 Tuesday 24 September 2024 15:03:27 -0400 (0:00:01.448) 0:00:01.457 ***** 40074 1727204607.69610: entering _queue_task() for 
managed-node2/include_tasks 40074 1727204607.69612: Creating lock for include_tasks 40074 1727204607.69994: worker is 1 (out of 1 available) 40074 1727204607.70009: exiting _queue_task() for managed-node2/include_tasks 40074 1727204607.70022: done queuing things up, now waiting for results queue to drain 40074 1727204607.70024: waiting for pending results... 40074 1727204607.70810: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 40074 1727204607.71002: in run() - task 12b410aa-8751-9fd7-2501-000000000006 40074 1727204607.71007: variable 'ansible_search_path' from source: unknown 40074 1727204607.71160: calling self._execute() 40074 1727204607.71304: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204607.71329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204607.71395: variable 'omit' from source: magic vars 40074 1727204607.71708: _execute() done 40074 1727204607.71712: dumping result to json 40074 1727204607.71717: done dumping result, returning 40074 1727204607.71781: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-9fd7-2501-000000000006] 40074 1727204607.71796: sending task result for task 12b410aa-8751-9fd7-2501-000000000006 40074 1727204607.72251: done sending task result for task 12b410aa-8751-9fd7-2501-000000000006 40074 1727204607.72256: WORKER PROCESS EXITING 40074 1727204607.72368: no more pending results, returning what we have 40074 1727204607.72373: in VariableManager get_vars() 40074 1727204607.72409: Calling all_inventory to load vars for managed-node2 40074 1727204607.72413: Calling groups_inventory to load vars for managed-node2 40074 1727204607.72418: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204607.72434: Calling all_plugins_play to load vars for managed-node2 40074 1727204607.72438: Calling groups_plugins_inventory to load vars for managed-node2 40074 
1727204607.72443: Calling groups_plugins_play to load vars for managed-node2 40074 1727204607.73732: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204607.74071: done with get_vars() 40074 1727204607.74081: variable 'ansible_search_path' from source: unknown 40074 1727204607.74099: we have included files to process 40074 1727204607.74101: generating all_blocks data 40074 1727204607.74102: done generating all_blocks data 40074 1727204607.74104: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 40074 1727204607.74105: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 40074 1727204607.74108: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 40074 1727204607.74945: in VariableManager get_vars() 40074 1727204607.74962: done with get_vars() 40074 1727204607.74974: done processing included file 40074 1727204607.74976: iterating over new_blocks loaded from include file 40074 1727204607.74977: in VariableManager get_vars() 40074 1727204607.74988: done with get_vars() 40074 1727204607.74991: filtering new block on tags 40074 1727204607.75007: done filtering new block on tags 40074 1727204607.75010: in VariableManager get_vars() 40074 1727204607.75020: done with get_vars() 40074 1727204607.75022: filtering new block on tags 40074 1727204607.75041: done filtering new block on tags 40074 1727204607.75044: in VariableManager get_vars() 40074 1727204607.75056: done with get_vars() 40074 1727204607.75057: filtering new block on tags 40074 1727204607.75075: done filtering new block on tags 40074 1727204607.75078: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for 
managed-node2 40074 1727204607.75084: extending task lists for all hosts with included blocks 40074 1727204607.75150: done extending task lists 40074 1727204607.75151: done processing included files 40074 1727204607.75152: results queue empty 40074 1727204607.75153: checking for any_errors_fatal 40074 1727204607.75155: done checking for any_errors_fatal 40074 1727204607.75156: checking for max_fail_percentage 40074 1727204607.75158: done checking for max_fail_percentage 40074 1727204607.75159: checking to see if all hosts have failed and the running result is not ok 40074 1727204607.75160: done checking to see if all hosts have failed 40074 1727204607.75160: getting the remaining hosts for this loop 40074 1727204607.75163: done getting the remaining hosts for this loop 40074 1727204607.75166: getting the next task for host managed-node2 40074 1727204607.75171: done getting next task for host managed-node2 40074 1727204607.75173: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 40074 1727204607.75176: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204607.75179: getting variables 40074 1727204607.75180: in VariableManager get_vars() 40074 1727204607.75192: Calling all_inventory to load vars for managed-node2 40074 1727204607.75195: Calling groups_inventory to load vars for managed-node2 40074 1727204607.75198: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204607.75207: Calling all_plugins_play to load vars for managed-node2 40074 1727204607.75210: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204607.75215: Calling groups_plugins_play to load vars for managed-node2 40074 1727204607.75474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204607.76143: done with get_vars() 40074 1727204607.76154: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 15:03:27 -0400 (0:00:00.069) 0:00:01.527 ***** 40074 1727204607.76542: entering _queue_task() for managed-node2/setup 40074 1727204607.77015: worker is 1 (out of 1 available) 40074 1727204607.77028: exiting _queue_task() for managed-node2/setup 40074 1727204607.77040: done queuing things up, now waiting for results queue to drain 40074 1727204607.77042: waiting for pending results... 
40074 1727204607.77362: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 40074 1727204607.77542: in run() - task 12b410aa-8751-9fd7-2501-0000000000d0 40074 1727204607.77546: variable 'ansible_search_path' from source: unknown 40074 1727204607.77549: variable 'ansible_search_path' from source: unknown 40074 1727204607.77557: calling self._execute() 40074 1727204607.77625: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204607.77635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204607.77649: variable 'omit' from source: magic vars 40074 1727204607.78297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204607.81098: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204607.81102: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204607.81105: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204607.81137: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204607.81167: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204607.81263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204607.81307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204607.81345: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204607.81398: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204607.81416: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204607.81639: variable 'ansible_facts' from source: unknown 40074 1727204607.81741: variable 'network_test_required_facts' from source: task vars 40074 1727204607.81783: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 40074 1727204607.81795: variable 'omit' from source: magic vars 40074 1727204607.81843: variable 'omit' from source: magic vars 40074 1727204607.81883: variable 'omit' from source: magic vars 40074 1727204607.81968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204607.81972: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204607.81974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204607.81990: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204607.82003: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204607.82042: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204607.82046: variable 'ansible_host' from source: host vars for 
'managed-node2' 40074 1727204607.82048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204607.82175: Set connection var ansible_pipelining to False 40074 1727204607.82185: Set connection var ansible_shell_executable to /bin/sh 40074 1727204607.82188: Set connection var ansible_shell_type to sh 40074 1727204607.82192: Set connection var ansible_connection to ssh 40074 1727204607.82235: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204607.82240: Set connection var ansible_timeout to 10 40074 1727204607.82243: variable 'ansible_shell_executable' from source: unknown 40074 1727204607.82247: variable 'ansible_connection' from source: unknown 40074 1727204607.82250: variable 'ansible_module_compression' from source: unknown 40074 1727204607.82253: variable 'ansible_shell_type' from source: unknown 40074 1727204607.82256: variable 'ansible_shell_executable' from source: unknown 40074 1727204607.82259: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204607.82295: variable 'ansible_pipelining' from source: unknown 40074 1727204607.82300: variable 'ansible_timeout' from source: unknown 40074 1727204607.82303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204607.82426: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204607.82449: variable 'omit' from source: magic vars 40074 1727204607.82453: starting attempt loop 40074 1727204607.82456: running the handler 40074 1727204607.82458: _low_level_execute_command(): starting 40074 1727204607.82516: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204607.83209: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 
1727204607.83229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204607.83294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204607.83366: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204607.83383: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204607.83472: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 40074 1727204607.85965: stdout chunk (state=3): >>>/root <<< 40074 1727204607.86249: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204607.86253: stdout chunk (state=3): >>><<< 40074 1727204607.86255: stderr chunk (state=3): >>><<< 40074 1727204607.86276: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 40074 1727204607.86359: _low_level_execute_command(): starting 40074 1727204607.86364: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097 `" && echo ansible-tmp-1727204607.8629277-40167-160850135173097="` echo /root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097 `" ) && sleep 0' 40074 1727204607.87147: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204607.87242: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204607.87270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204607.87301: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204607.87314: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204607.87345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204607.87435: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 40074 1727204607.90454: stdout chunk (state=3): >>>ansible-tmp-1727204607.8629277-40167-160850135173097=/root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097 <<< 40074 1727204607.90684: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204607.90692: stderr chunk (state=3): >>><<< 40074 1727204607.90698: stdout chunk (state=3): >>><<< 40074 1727204607.90733: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204607.8629277-40167-160850135173097=/root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 40074 1727204607.90760: variable 'ansible_module_compression' from source: unknown 40074 1727204607.90807: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 40074 1727204607.90857: variable 'ansible_facts' from source: unknown 40074 1727204607.90975: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/AnsiballZ_setup.py 40074 1727204607.91095: Sending initial data 40074 1727204607.91106: Sent initial data (154 bytes) 40074 1727204607.91809: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204607.91858: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204607.91871: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204607.91881: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204607.91963: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 40074 1727204607.94501: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204607.94556: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204607.94614: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpb9r2gsa5 /root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/AnsiballZ_setup.py <<< 40074 1727204607.94618: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/AnsiballZ_setup.py" <<< 40074 1727204607.94682: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpb9r2gsa5" to remote "/root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/AnsiballZ_setup.py" <<< 40074 1727204607.96914: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204607.96992: stderr chunk (state=3): >>><<< 40074 1727204607.97102: stdout chunk (state=3): >>><<< 40074 1727204607.97105: done transferring module to remote 40074 1727204607.97108: _low_level_execute_command(): starting 40074 1727204607.97111: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/ /root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/AnsiballZ_setup.py && sleep 0' 40074 1727204607.97478: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204607.97494: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 40074 1727204607.97511: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204607.97567: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204607.97585: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204607.97627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 40074 1727204608.00361: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204608.00416: stderr chunk (state=3): >>><<< 40074 1727204608.00419: stdout chunk (state=3): >>><<< 40074 1727204608.00437: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 40074 1727204608.00441: _low_level_execute_command(): starting 40074 1727204608.00447: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/AnsiballZ_setup.py && sleep 0' 40074 1727204608.00910: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204608.00913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204608.00916: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204608.00918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204608.00984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204608.00987: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204608.01039: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 4 <<< 40074 1727204608.04347: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 40074 1727204608.04413: stdout chunk (state=3): >>>import _imp # builtin <<< 40074 1727204608.04460: stdout chunk (state=3): >>>import '_thread' # <<< 40074 1727204608.04468: stdout chunk (state=3): >>> <<< 40074 1727204608.04487: stdout chunk (state=3): >>>import '_warnings' # <<< 40074 1727204608.04498: stdout chunk (state=3): >>> import '_weakref' # <<< 40074 1727204608.04606: stdout chunk (state=3): >>> import '_io' # <<< 40074 1727204608.04623: stdout chunk (state=3): >>> <<< 40074 1727204608.04633: stdout chunk (state=3): >>>import 'marshal' # <<< 40074 1727204608.04636: stdout chunk (state=3): >>> <<< 40074 1727204608.04708: stdout chunk (state=3): >>>import 'posix' # <<< 40074 1727204608.04710: stdout chunk (state=3): >>> <<< 40074 1727204608.04760: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 40074 1727204608.04771: stdout chunk (state=3): >>> <<< 40074 1727204608.04794: stdout chunk (state=3): >>># installing zipimport hook <<< 40074 1727204608.04824: stdout chunk (state=3): >>>import 'time' # <<< 40074 1727204608.04848: stdout chunk (state=3): >>> import 'zipimport' # <<< 40074 1727204608.04854: stdout chunk (state=3): >>> # installed zipimport hook<<< 40074 1727204608.04895: stdout chunk (state=3): >>> <<< 40074 1727204608.04946: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py<<< 40074 1727204608.04967: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 40074 1727204608.05013: stdout chunk (state=3): >>> import '_codecs' # <<< 40074 1727204608.05016: stdout chunk (state=3): >>> <<< 40074 1727204608.05058: stdout chunk (state=3): >>>import 'codecs' # <<< 40074 1727204608.05063: stdout chunk (state=3): >>> <<< 40074 1727204608.05158: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc'<<< 40074 1727204608.05164: stdout chunk (state=3): >>> <<< 40074 1727204608.05181: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00060c4d0><<< 40074 1727204608.05205: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0005dbad0><<< 40074 1727204608.05250: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 40074 1727204608.05260: stdout chunk (state=3): >>> <<< 40074 1727204608.05265: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 40074 1727204608.05293: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00060ea20><<< 40074 1727204608.05299: stdout chunk (state=3): >>> <<< 40074 1727204608.05362: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 40074 1727204608.05369: stdout chunk (state=3): >>> <<< 40074 1727204608.05393: stdout chunk (state=3): >>>import 'abc' # <<< 40074 1727204608.05398: stdout chunk (state=3): >>> <<< 40074 1727204608.05469: stdout chunk (state=3): >>>import 'io' # import '_stat' # <<< 40074 1727204608.05535: stdout chunk (state=3): >>> <<< 40074 1727204608.05556: stdout chunk (state=3): >>>import 'stat' # <<< 40074 1727204608.05559: stdout chunk (state=3): >>> <<< 40074 1727204608.05699: stdout chunk (state=3): >>>import '_collections_abc' # <<< 40074 1727204608.05747: stdout chunk (state=3): >>> import 'genericpath' # <<< 40074 1727204608.05760: stdout chunk (state=3): >>> <<< 40074 1727204608.05775: stdout chunk (state=3): >>>import 'posixpath' # <<< 
40074 1727204608.05784: stdout chunk (state=3): >>> <<< 40074 1727204608.05857: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 40074 1727204608.05864: stdout chunk (state=3): >>> <<< 40074 1727204608.05901: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages<<< 40074 1727204608.05918: stdout chunk (state=3): >>> Adding directory: '/usr/local/lib/python3.12/site-packages'<<< 40074 1727204608.05937: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages'<<< 40074 1727204608.05957: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages'<<< 40074 1727204608.05979: stdout chunk (state=3): >>> Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth'<<< 40074 1727204608.06069: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 40074 1727204608.06096: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 40074 1727204608.06142: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004210a0><<< 40074 1727204608.06147: stdout chunk (state=3): >>> <<< 40074 1727204608.06237: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 40074 1727204608.06261: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204608.06292: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000421fd0> <<< 40074 1727204608.06378: stdout chunk (state=3): >>>import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red 
Hat 13.3.1-1)] on linux<<< 40074 1727204608.06397: stdout chunk (state=3): >>> <<< 40074 1727204608.06403: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. <<< 40074 1727204608.07092: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 40074 1727204608.07103: stdout chunk (state=3): >>> <<< 40074 1727204608.07148: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc'<<< 40074 1727204608.07152: stdout chunk (state=3): >>> <<< 40074 1727204608.07252: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 40074 1727204608.07258: stdout chunk (state=3): >>> <<< 40074 1727204608.07295: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 40074 1727204608.07300: stdout chunk (state=3): >>> <<< 40074 1727204608.07343: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 40074 1727204608.07349: stdout chunk (state=3): >>> <<< 40074 1727204608.07379: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045fe90><<< 40074 1727204608.07384: stdout chunk (state=3): >>> <<< 40074 1727204608.07447: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 40074 1727204608.07453: stdout chunk (state=3): >>> <<< 40074 1727204608.07490: stdout 
chunk (state=3): >>>import '_operator' # <<< 40074 1727204608.07513: stdout chunk (state=3): >>> import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045ff50><<< 40074 1727204608.07550: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 40074 1727204608.07555: stdout chunk (state=3): >>> <<< 40074 1727204608.07663: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 40074 1727204608.07785: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # <<< 40074 1727204608.07792: stdout chunk (state=3): >>> <<< 40074 1727204608.07824: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 40074 1727204608.07851: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000497890><<< 40074 1727204608.07881: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 40074 1727204608.07907: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 40074 1727204608.07925: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000497f20><<< 40074 1727204608.08206: stdout chunk (state=3): >>> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000477b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000475280> import 
'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045d040> <<< 40074 1727204608.08260: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py<<< 40074 1727204608.08265: stdout chunk (state=3): >>> <<< 40074 1727204608.08305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 40074 1727204608.08342: stdout chunk (state=3): >>>import '_sre' # <<< 40074 1727204608.08348: stdout chunk (state=3): >>> <<< 40074 1727204608.08397: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py<<< 40074 1727204608.08400: stdout chunk (state=3): >>> <<< 40074 1727204608.08442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 40074 1727204608.08476: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 40074 1727204608.08513: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 40074 1727204608.08515: stdout chunk (state=3): >>> <<< 40074 1727204608.08586: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004bb7d0> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ba3f0><<< 40074 1727204608.08595: stdout chunk (state=3): >>> <<< 40074 1727204608.08631: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 40074 1727204608.08656: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000476270> import 
're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004b8b00><<< 40074 1727204608.08743: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py<<< 40074 1727204608.08751: stdout chunk (state=3): >>> <<< 40074 1727204608.08767: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 40074 1727204608.08775: stdout chunk (state=3): >>> <<< 40074 1727204608.08799: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ec770> <<< 40074 1727204608.08816: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045c2c0><<< 40074 1727204608.08821: stdout chunk (state=3): >>> <<< 40074 1727204608.08849: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 40074 1727204608.08904: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.08933: stdout chunk (state=3): >>># extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 40074 1727204608.08942: stdout chunk (state=3): >>> import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0004ecc20><<< 40074 1727204608.08960: stdout chunk (state=3): >>> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ecad0> <<< 40074 1727204608.09007: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 
40074 1727204608.09036: stdout chunk (state=3): >>> import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0004ecec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045ade0><<< 40074 1727204608.09080: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py<<< 40074 1727204608.09085: stdout chunk (state=3): >>> <<< 40074 1727204608.09103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 40074 1727204608.09148: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 40074 1727204608.09155: stdout chunk (state=3): >>> <<< 40074 1727204608.09204: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc'<<< 40074 1727204608.09235: stdout chunk (state=3): >>> import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ed5b0><<< 40074 1727204608.09238: stdout chunk (state=3): >>> <<< 40074 1727204608.09276: stdout chunk (state=3): >>>import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ed280> import 'importlib.machinery' # <<< 40074 1727204608.09279: stdout chunk (state=3): >>> <<< 40074 1727204608.09351: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ee4b0><<< 40074 1727204608.09357: stdout chunk (state=3): >>> <<< 40074 1727204608.09383: stdout chunk (state=3): >>>import 'importlib.util' # <<< 40074 1727204608.09407: stdout chunk (state=3): >>> import 'runpy' # <<< 40074 1727204608.09452: stdout chunk 
(state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 40074 1727204608.09458: stdout chunk (state=3): >>> <<< 40074 1727204608.09521: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 40074 1727204608.09526: stdout chunk (state=3): >>> <<< 40074 1727204608.09554: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py<<< 40074 1727204608.09575: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc'<<< 40074 1727204608.09612: stdout chunk (state=3): >>> import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0005086e0> import 'errno' # <<< 40074 1727204608.09627: stdout chunk (state=3): >>> <<< 40074 1727204608.09656: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 40074 1727204608.09681: stdout chunk (state=3): >>> # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 40074 1727204608.09726: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000509e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 40074 1727204608.09732: stdout chunk (state=3): >>> <<< 40074 1727204608.09749: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc'<<< 40074 1727204608.09785: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 40074 1727204608.09792: stdout chunk (state=3): >>> <<< 40074 1727204608.09832: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import 
'_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00050acf0> <<< 40074 1727204608.09880: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 40074 1727204608.09907: stdout chunk (state=3): >>> import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00050b350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00050a270><<< 40074 1727204608.09943: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 40074 1727204608.09970: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 40074 1727204608.10029: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so'<<< 40074 1727204608.10050: stdout chunk (state=3): >>> <<< 40074 1727204608.10072: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.10076: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00050bdd0> <<< 40074 1727204608.10161: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00050b500> <<< 40074 1727204608.10259: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ee510> <<< 40074 1727204608.10264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 40074 1727204608.10308: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 40074 1727204608.10347: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.10395: stdout chunk (state=3): >>># extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00025fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 40074 1727204608.10404: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0002887d0> <<< 40074 1727204608.10466: stdout chunk (state=3): >>>import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000288560> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000288800> <<< 40074 1727204608.10488: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fc0002889e0> <<< 40074 1727204608.10494: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00025de50> <<< 40074 1727204608.10515: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 40074 1727204608.10631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 40074 1727204608.10669: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 40074 1727204608.10672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 40074 1727204608.10706: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00028a000> <<< 40074 1727204608.10709: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000288c80> <<< 40074 1727204608.10742: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004eec00> <<< 40074 1727204608.10746: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 40074 1727204608.10810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204608.10823: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 40074 1727204608.10873: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 40074 1727204608.10899: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0002b6390> <<< 40074 1727204608.10948: stdout chunk 
(state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 40074 1727204608.10959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204608.10991: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 40074 1727204608.11012: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 40074 1727204608.11061: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0002ce510> <<< 40074 1727204608.11073: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 40074 1727204608.11124: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 40074 1727204608.11176: stdout chunk (state=3): >>>import 'ntpath' # <<< 40074 1727204608.11205: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00030b290> <<< 40074 1727204608.11234: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 40074 1727204608.11264: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 40074 1727204608.11292: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 40074 1727204608.11356: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 40074 1727204608.11448: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00032da30> <<< 40074 1727204608.11872: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00030b3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0002cf1a0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001203e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0002cd550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00028af30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 40074 1727204608.11878: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc0002cd910> <<< 40074 1727204608.12163: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_neopkmhb/ansible_setup_payload.zip' <<< 40074 1727204608.12170: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.12441: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.12502: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 40074 1727204608.12506: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc'<<< 40074 1727204608.12520: stdout chunk (state=3): >>> <<< 40074 1727204608.12584: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py<<< 40074 
1727204608.12602: stdout chunk (state=3): >>> <<< 40074 1727204608.12734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc'<<< 40074 1727204608.12741: stdout chunk (state=3): >>> <<< 40074 1727204608.12781: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py<<< 40074 1727204608.12798: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc'<<< 40074 1727204608.12831: stdout chunk (state=3): >>> import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00018e0f0> import '_typing' # <<< 40074 1727204608.12839: stdout chunk (state=3): >>> <<< 40074 1727204608.13179: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000164fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000164170><<< 40074 1727204608.13208: stdout chunk (state=3): >>> # zipimport: zlib available<<< 40074 1727204608.13214: stdout chunk (state=3): >>> <<< 40074 1727204608.13271: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available<<< 40074 1727204608.13312: stdout chunk (state=3): >>> # zipimport: zlib available<<< 40074 1727204608.13317: stdout chunk (state=3): >>> <<< 40074 1727204608.13341: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.13369: stdout chunk (state=3): >>> import 'ansible.module_utils' # <<< 40074 1727204608.13374: stdout chunk (state=3): >>> <<< 40074 1727204608.13593: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.15557: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.17879: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from 
'/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' <<< 40074 1727204608.17895: stdout chunk (state=3): >>>import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000167f80> <<< 40074 1727204608.17949: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 40074 1727204608.17974: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 40074 1727204608.17987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 40074 1727204608.18011: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0001bdb20> <<< 40074 1727204608.18056: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001bd8b0> <<< 40074 1727204608.18096: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001bd1c0> <<< 40074 1727204608.18117: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py <<< 40074 1727204608.18131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 40074 1727204608.18168: stdout chunk (state=3): 
>>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001bd910> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00018ed80> <<< 40074 1727204608.18201: stdout chunk (state=3): >>>import 'atexit' # <<< 40074 1727204608.18217: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0001be8a0> <<< 40074 1727204608.18242: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.18255: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0001beae0> <<< 40074 1727204608.18277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 40074 1727204608.18328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 40074 1727204608.18345: stdout chunk (state=3): >>>import '_locale' # <<< 40074 1727204608.18406: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001befc0> <<< 40074 1727204608.18429: stdout chunk (state=3): >>>import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 40074 1727204608.18453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 40074 1727204608.18520: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fc000024da0> <<< 40074 1727204608.18544: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0000269c0> <<< 40074 1727204608.18563: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 40074 1727204608.18579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 40074 1727204608.18618: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000027260> <<< 40074 1727204608.18651: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 40074 1727204608.18693: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 40074 1727204608.18698: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000028440> <<< 40074 1727204608.18709: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 40074 1727204608.18770: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 40074 1727204608.18783: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 40074 1727204608.18834: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002af00> <<< 40074 1727204608.18878: stdout chunk (state=3): >>># 
extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00002b020> <<< 40074 1727204608.18903: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0000291c0> <<< 40074 1727204608.18923: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 40074 1727204608.18952: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 40074 1727204608.18986: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 40074 1727204608.19008: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 40074 1727204608.19035: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 40074 1727204608.19062: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 40074 1727204608.19095: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002ef30> <<< 40074 1727204608.19112: stdout chunk (state=3): >>>import '_tokenize' # <<< 40074 1727204608.19174: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002da00> import 'linecache' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc00002d790> <<< 40074 1727204608.19217: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 40074 1727204608.19232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 40074 1727204608.19293: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002fec0> <<< 40074 1727204608.19329: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0000296d0> <<< 40074 1727204608.19352: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000073080> <<< 40074 1727204608.19386: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000073200> <<< 40074 1727204608.19415: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 40074 1727204608.19448: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 40074 1727204608.19470: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from 
'/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 40074 1727204608.19506: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.19527: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000078dd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000078b90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 40074 1727204608.19675: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 40074 1727204608.19723: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.19735: stdout chunk (state=3): >>># extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00007b320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0000794c0> <<< 40074 1727204608.19763: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 40074 1727204608.19810: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204608.19851: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 40074 1727204608.19854: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 40074 1727204608.19874: stdout chunk (state=3): 
>>>import '_string' # <<< 40074 1727204608.19913: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000082b10> <<< 40074 1727204608.20073: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00007b4a0> <<< 40074 1727204608.20154: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000083e00> <<< 40074 1727204608.20196: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000083800> <<< 40074 1727204608.20249: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0000834d0> <<< 40074 1727204608.20267: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000073500> <<< 40074 1727204608.20295: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 40074 1727204608.20341: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 40074 1727204608.20353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 40074 1727204608.20383: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.20415: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000086ab0> <<< 40074 1727204608.20614: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.20649: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000087f50> <<< 40074 1727204608.20654: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000085250> <<< 40074 1727204608.20708: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000086600> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000084e00> <<< 40074 1727204608.20711: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 
1727204608.20746: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 40074 1727204608.20749: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.20856: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.20965: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.20981: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 40074 1727204608.21011: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.21038: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 40074 1727204608.21050: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.21182: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.21326: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.22029: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.22730: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 40074 1727204608.22734: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # <<< 40074 1727204608.22787: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 40074 1727204608.22793: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 40074 1727204608.22809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204608.22859: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.22873: stdout chunk (state=3): >>># extension module '_ctypes' executed from 
'/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbffff102c0> <<< 40074 1727204608.22974: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 40074 1727204608.23020: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff111f0> <<< 40074 1727204608.23032: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00008b650> <<< 40074 1727204608.23066: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 40074 1727204608.23090: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.23122: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.23146: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 40074 1727204608.23157: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.23319: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.23508: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 40074 1727204608.23541: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00008bdd0> <<< 40074 1727204608.23545: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.24120: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.24673: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.24757: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.24856: stdout chunk (state=3): 
>>>import 'ansible.module_utils.common.collections' # <<< 40074 1727204608.24862: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.24909: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.24958: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 40074 1727204608.24965: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.25042: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.25169: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 40074 1727204608.25174: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.25217: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 40074 1727204608.25222: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.25257: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.25308: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 40074 1727204608.25327: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.25604: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.25902: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 40074 1727204608.25976: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 40074 1727204608.26002: stdout chunk (state=3): >>>import '_ast' # <<< 40074 1727204608.26083: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff13620> <<< 40074 1727204608.26105: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.26178: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.26262: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # 
import 'ansible.module_utils.common.validation' # <<< 40074 1727204608.26290: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 40074 1727204608.26322: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 40074 1727204608.26346: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 40074 1727204608.26419: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.26565: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbffff19bb0> <<< 40074 1727204608.26620: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbffff1a510> <<< 40074 1727204608.26624: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff12540> <<< 40074 1727204608.26644: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.26688: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.26740: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 40074 1727204608.26744: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.26795: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.26837: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 
1727204608.26905: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.26975: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 40074 1727204608.27029: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204608.27123: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbffff19280> <<< 40074 1727204608.27170: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff1a720> <<< 40074 1727204608.27207: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 40074 1727204608.27231: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.27288: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.27359: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.27391: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.27445: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204608.27485: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 40074 
1727204608.27496: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 40074 1727204608.27507: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 40074 1727204608.27578: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 40074 1727204608.27614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 40074 1727204608.27617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 40074 1727204608.27679: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffae8d0> <<< 40074 1727204608.27725: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff24500> <<< 40074 1727204608.27818: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff22720> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff224e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 40074 1727204608.27844: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.27866: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.27904: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 40074 1727204608.27974: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 40074 1727204608.27977: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28013: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 40074 1727204608.28077: stdout 
chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28154: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28166: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28188: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28244: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28281: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28327: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28379: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 40074 1727204608.28383: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28464: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28541: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28568: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28607: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 40074 1727204608.28619: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.28821: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.29017: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.29057: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.29121: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204608.29154: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 40074 1727204608.29180: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 40074 1727204608.29183: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 40074 1727204608.29237: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 40074 1727204608.29250: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffb1040> <<< 40074 1727204608.29271: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 40074 1727204608.29292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 40074 1727204608.29305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 40074 1727204608.29353: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 40074 1727204608.29374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 40074 1727204608.29398: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 40074 1727204608.29410: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff507fe0> <<< 40074 1727204608.29446: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.29466: stdout chunk (state=3): >>># extension module '_pickle' executed from 
'/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff50c290> <<< 40074 1727204608.29507: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff89070> <<< 40074 1727204608.29526: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff88350> <<< 40074 1727204608.29566: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffb3260> <<< 40074 1727204608.29592: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffb2ba0> <<< 40074 1727204608.29605: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 40074 1727204608.29657: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 40074 1727204608.29678: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 40074 1727204608.29714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 40074 1727204608.29748: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.29772: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7fbfff50f320> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff50ebd0> <<< 40074 1727204608.29807: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff50edb0> <<< 40074 1727204608.29838: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff50e000> <<< 40074 1727204608.29841: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 40074 1727204608.29959: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 40074 1727204608.29998: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff50f500> <<< 40074 1727204608.30003: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 40074 1727204608.30030: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 40074 1727204608.30060: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff57a000> <<< 40074 1727204608.30100: stdout chunk (state=3): >>>import 'multiprocessing.connection' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbfff50ffe0> <<< 40074 1727204608.30136: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffb2d50> import 'ansible.module_utils.facts.timeout' # <<< 40074 1727204608.30170: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # # zipimport: zlib available <<< 40074 1727204608.30187: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 40074 1727204608.30214: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30267: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30337: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 40074 1727204608.30355: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30406: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30470: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 40074 1727204608.30501: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 40074 1727204608.30519: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30546: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30582: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 40074 1727204608.30606: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30642: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30712: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 40074 1727204608.30716: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30753: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30803: stdout chunk (state=3): 
>>>import 'ansible.module_utils.facts.system.chroot' # <<< 40074 1727204608.30821: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30871: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.30940: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.31003: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.31075: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 40074 1727204608.31087: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 40074 1727204608.31650: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32149: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 40074 1727204608.32167: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32213: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32284: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32316: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32361: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 40074 1727204608.32378: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32408: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 40074 1727204608.32461: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32515: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32588: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 40074 1727204608.32593: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32626: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 40074 1727204608.32656: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 40074 1727204608.32677: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32701: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32747: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 40074 1727204608.32751: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32836: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.32927: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 40074 1727204608.32951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 40074 1727204608.32977: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff57b470> <<< 40074 1727204608.33009: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 40074 1727204608.33021: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 40074 1727204608.33152: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff57abd0> import 'ansible.module_utils.facts.system.local' # <<< 40074 1727204608.33183: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.33246: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.33331: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 40074 1727204608.33335: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.33429: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.33553: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.system.pkg_mgr' # <<< 40074 1727204608.33557: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.33612: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.33706: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available <<< 40074 1727204608.33759: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.33808: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 40074 1727204608.33860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 40074 1727204608.33935: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.34010: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff5aa0f0> <<< 40074 1727204608.34225: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff1a480> <<< 40074 1727204608.34241: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # <<< 40074 1727204608.34562: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available <<< 40074 1727204608.34688: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.34895: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.35149: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 40074 1727204608.35170: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 40074 1727204608.35239: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.35300: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 40074 1727204608.35311: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.35367: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.35447: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 40074 1727204608.35495: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204608.35521: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff39dd90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff39d9d0> import 'ansible.module_utils.facts.system.user' # <<< 40074 1727204608.35549: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.35573: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.35592: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware' # <<< 40074 1727204608.35601: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.35726: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available <<< 40074 1727204608.36020: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.36293: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 40074 1727204608.36311: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.36484: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.36694: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.36727: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.36798: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # <<< 40074 1727204608.36848: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 40074 1727204608.36876: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.37110: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.37451: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 40074 1727204608.37727: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # <<< 40074 1727204608.37744: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 40074 1727204608.38450: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.39260: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 40074 1727204608.39313: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.39468: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.39655: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 40074 1727204608.39671: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.39846: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.40021: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 40074 1727204608.40040: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 40074 1727204608.40320: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.40601: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 40074 1727204608.40608: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.40630: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 40074 1727204608.40657: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.40721: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.40789: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 40074 1727204608.40805: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.40975: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.41148: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.41523: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.41771: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 40074 1727204608.41799: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.41822: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.41862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 40074 1727204608.41890: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.41926: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # <<< 40074 1727204608.41943: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.42035: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.42110: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 40074 1727204608.42115: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 40074 1727204608.42155: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.42159: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 40074 1727204608.42184: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.42560: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 40074 1727204608.43033: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.43525: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 40074 1727204608.43540: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.43629: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.43723: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 40074 1727204608.43740: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.43802: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.43842: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 40074 1727204608.43873: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.43952: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 40074 1727204608.43973: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.44067: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.44074: stdout chunk (state=3): >>> <<< 40074 1727204608.44137: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available<<< 40074 1727204608.44144: stdout chunk (state=3): >>> <<< 40074 1727204608.44411: stdout chunk (state=3): >>># 
zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # <<< 40074 1727204608.44425: stdout chunk (state=3): >>> <<< 40074 1727204608.44449: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.44465: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.44479: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 40074 1727204608.44487: stdout chunk (state=3): >>> <<< 40074 1727204608.44517: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.44591: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.44598: stdout chunk (state=3): >>> <<< 40074 1727204608.44666: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 40074 1727204608.44673: stdout chunk (state=3): >>> <<< 40074 1727204608.44805: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 40074 1727204608.44847: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.44936: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.45071: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.45075: stdout chunk (state=3): >>> <<< 40074 1727204608.45201: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 40074 1727204608.45215: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # <<< 40074 1727204608.45227: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 40074 1727204608.45255: stdout chunk (state=3): >>> # zipimport: zlib available<<< 40074 1727204608.45261: stdout chunk (state=3): >>> <<< 40074 1727204608.45343: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.45426: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.hpux' # <<< 40074 1727204608.45454: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 40074 1727204608.45843: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.46208: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 40074 1727204608.46232: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.46316: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.46396: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 40074 1727204608.46420: stdout chunk (state=3): >>> # zipimport: zlib available<<< 40074 1727204608.46426: stdout chunk (state=3): >>> <<< 40074 1727204608.46499: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.46581: stdout chunk (state=3): >>> import 'ansible.module_utils.facts.virtual.openbsd' # <<< 40074 1727204608.46606: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.46747: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.46753: stdout chunk (state=3): >>> <<< 40074 1727204608.46895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 40074 1727204608.46913: stdout chunk (state=3): >>> <<< 40074 1727204608.46920: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 40074 1727204608.46940: stdout chunk (state=3): >>># zipimport: zlib available<<< 40074 1727204608.47091: stdout chunk (state=3): >>> # zipimport: zlib available <<< 40074 1727204608.47254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 40074 1727204608.47274: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # <<< 40074 1727204608.47282: stdout chunk (state=3): >>> import 'ansible.module_utils.facts' # <<< 40074 1727204608.47406: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204608.48987: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 40074 1727204608.49026: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 40074 1727204608.49051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 40074 1727204608.49098: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff3c7770> <<< 40074 1727204608.49128: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff3c6480> <<< 40074 1727204608.49216: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff3c5430> <<< 40074 1727204608.49741: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_local": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", 
"ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "28", "epoch": "1727204608", "epoch_int": "1727204608", "date": "2024-09-24", "time": "15:03:28", "iso8601_micro": "2024-09-24T19:03:28.488308Z", "iso8601": "2024-09-24T19:03:28Z", "iso8601_basic": "20240924T150328488308", "iso8601_basic_short": "20240924T150328", "tz": "EDT", "tz_dst": "EDT", "<<< 40074 1727204608.49751: stdout chunk (state=3): >>>tz_offset": "-0400"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", 
"ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 40074 1727204608.50765: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # 
cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # 
cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder #<<< 40074 1727204608.50891: stdout chunk (state=3): >>> cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime <<< 40074 1727204608.51125: stdout chunk (state=3): >>># cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # 
cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # 
cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heap<<< 40074 1727204608.51133: stdout chunk (state=3): >>>q # cleanup[2] removing heapq # destroy heapq # cleanup[2] 
removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing 
ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ans<<< 40074 1727204608.51157: 
stdout chunk (state=3): >>>ible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy 
ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 40074 1727204608.51434: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 40074 1727204608.51449: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 40074 1727204608.51476: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 40074 1727204608.51502: stdout chunk (state=3): >>># destroy binascii # destroy zlib # destroy bz2 # destroy lzma <<< 40074 1727204608.51518: stdout chunk (state=3): >>># destroy zipfile._path <<< 40074 1727204608.51534: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 40074 1727204608.51580: stdout chunk (state=3): >>># destroy ntpath <<< 40074 1727204608.51604: stdout chunk (state=3): >>># destroy importlib <<< 40074 1727204608.51611: stdout chunk (state=3): >>># destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib <<< 40074 1727204608.51696: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 40074 1727204608.51724: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil <<< 40074 1727204608.51749: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 40074 
1727204608.51815: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 40074 1727204608.51830: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context <<< 40074 1727204608.51846: stdout chunk (state=3): >>># destroy array # destroy _compat_pickle <<< 40074 1727204608.51864: stdout chunk (state=3): >>># destroy _pickle <<< 40074 1727204608.51875: stdout chunk (state=3): >>># destroy queue # destroy _heapq <<< 40074 1727204608.51895: stdout chunk (state=3): >>># destroy _queue # destroy multiprocessing.process <<< 40074 1727204608.51902: stdout chunk (state=3): >>># destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing <<< 40074 1727204608.51934: stdout chunk (state=3): >>># destroy shlex # destroy fcntl # destroy datetime <<< 40074 1727204608.51952: stdout chunk (state=3): >>># destroy subprocess # destroy base64 <<< 40074 1727204608.52006: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux <<< 40074 1727204608.52019: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios <<< 40074 1727204608.52024: stdout chunk (state=3): >>># destroy errno # destroy json <<< 40074 1727204608.52065: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector <<< 40074 1727204608.52147: stdout chunk (state=3): >>># cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian <<< 40074 1727204608.52173: stdout chunk (state=3): >>># cleanup[3] 
wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback <<< 40074 1727204608.52184: stdout chunk (state=3): >>># destroy linecache # destroy textwrap # cleanup[3] wiping tokenize <<< 40074 1727204608.52202: stdout chunk (state=3): >>># cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 <<< 40074 1727204608.52232: stdout chunk (state=3): >>># cleanup[3] wiping contextlib # cleanup[3] wiping threading <<< 40074 1727204608.52236: stdout chunk (state=3): >>># cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external <<< 40074 1727204608.52239: stdout chunk (state=3): >>># cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re <<< 40074 1727204608.52257: stdout chunk (state=3): >>># destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg <<< 40074 1727204608.52265: stdout chunk (state=3): >>># cleanup[3] wiping re._parser <<< 40074 1727204608.52283: stdout chunk (state=3): >>># cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 40074 1727204608.52297: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # 
cleanup[3] wiping types <<< 40074 1727204608.52323: stdout chunk (state=3): >>># cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 40074 1727204608.52337: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 40074 1727204608.52359: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 40074 1727204608.52599: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 40074 1727204608.52661: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 40074 1727204608.52666: stdout chunk (state=3): >>># destroy tokenize <<< 40074 1727204608.52746: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize <<< 40074 1727204608.52765: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser <<< 40074 1727204608.52771: stdout chunk (state=3): >>># destroy functools 
# destroy operator # destroy ansible.module_utils.six.moves <<< 40074 1727204608.52821: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 40074 1727204608.52949: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections <<< 40074 1727204608.52956: stdout chunk (state=3): >>># destroy threading <<< 40074 1727204608.52974: stdout chunk (state=3): >>># destroy atexit # destroy _warnings # destroy math # destroy _bisect <<< 40074 1727204608.53010: stdout chunk (state=3): >>># destroy time # destroy _random # destroy _weakref <<< 40074 1727204608.53051: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _sre <<< 40074 1727204608.53058: stdout chunk (state=3): >>># destroy _string # destroy re <<< 40074 1727204608.53083: stdout chunk (state=3): >>># destroy itertools <<< 40074 1727204608.53094: stdout chunk (state=3): >>># destroy _abc # destroy posix # destroy _functools <<< 40074 1727204608.53196: stdout chunk (state=3): >>># destroy builtins # destroy _thread # clear sys.audit hooks <<< 40074 1727204608.53775: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204608.53846: stderr chunk (state=3): >>><<< 40074 1727204608.53850: stdout chunk (state=3): >>><<< 40074 1727204608.53966: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00060c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0005dbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00060ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000421fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045fe90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045ff50> # 
/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000497890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000497f20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000477b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000475280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045d040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004bb7d0> import 're._parser' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ba3f0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000476270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004b8b00> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ec770> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045c2c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0004ecc20> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ecad0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0004ecec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00045ade0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from 
'/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ed5b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ed280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ee4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0005086e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000509e20> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc00050acf0> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00050b350> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00050a270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00050bdd0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00050b500> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004ee510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00025fcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension 
module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0002887d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000288560> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000288800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0002889e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00025de50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00028a000> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000288c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0004eec00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0002b6390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0002ce510> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00030b290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00032da30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00030b3b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0002cf1a0> # 
/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001203e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0002cd550> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00028af30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fc0002cd910> # zipimport: found 103 names in '/tmp/ansible_setup_payload_neopkmhb/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00018e0f0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000164fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000164170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # 
code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000167f80> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0001bdb20> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001bd8b0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001bd1c0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001bd910> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00018ed80> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0001be8a0> # extension module 
'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0001beae0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0001befc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000024da0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0000269c0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000027260> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000028440> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches 
/usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002af00> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00002b020> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0000291c0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002ef30> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002da00> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002d790> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00002fec0> import 'traceback' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fc0000296d0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000073080> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000073200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000078dd0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000078b90> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import 
'_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc00007b320> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc0000794c0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000082b10> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00007b4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000083e00> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000083800> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc0000834d0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000073500> # 
/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000086ab0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000087f50> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000085250> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fc000086600> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc000084e00> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available 
# zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbffff102c0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff111f0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00008b650> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fc00008bdd0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff13620> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbffff19bb0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbffff1a510> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff12540> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbffff19280> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff1a720> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffae8d0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff24500> import 'distro.distro' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fbffff22720> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff224e0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffb1040> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff507fe0> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff50c290> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff89070> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff88350> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffb3260> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffb2ba0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' 
executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff50f320> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff50ebd0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff50edb0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff50e000> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff50f500> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff57a000> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff50ffe0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffffb2d50> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff57b470> # 
/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff57abd0> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff5aa0f0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbffff1a480> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from 
'/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff39dd90> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff39d9d0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fbfff3c7770> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff3c6480> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fbfff3c5430> {"ansible_facts": 
{"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/1"}, "ansible_local": {}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_lsb": {}, "ansible_date_time": {"year": 
"2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "28", "epoch": "1727204608", "epoch_int": "1727204608", "date": "2024-09-24", "time": "15:03:28", "iso8601_micro": "2024-09-24T19:03:28.488308Z", "iso8601": "2024-09-24T19:03:28Z", "iso8601_basic": "20240924T150328488308", "iso8601_basic_short": "20240924T150328", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} 
# clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing 
re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # 
cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing 
ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing 
ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns 
# cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing 
ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy 
ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy 
ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # 
destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib 
# cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy 
ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy 
keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing 
pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing 
ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # 
destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline 
# cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] 
removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos 
# destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib 
# destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # 
cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy 
sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks
40074 1727204608.54886: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
40074 1727204608.54893: _low_level_execute_command(): starting
40074 1727204608.54897: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204607.8629277-40167-160850135173097/ > /dev/null 2>&1 && sleep 0'
40074 1727204608.54900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<<
40074 1727204608.54902: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204608.54905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204608.54907: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
40074 1727204608.54940: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
40074 1727204608.54944: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204608.54946: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<<
40074 1727204608.54948: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
40074 1727204608.54950: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204608.55009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204608.55013: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
40074 1727204608.55028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204608.55074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<<
40074 1727204608.57824: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204608.57870: stderr chunk (state=3): >>><<<
40074 1727204608.57876: stdout chunk (state=3): >>><<<
40074 1727204608.57895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0
40074 1727204608.57904: handler run complete
40074 1727204608.57944: variable 'ansible_facts' from source: unknown
40074 1727204608.57997: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204608.58104: variable 'ansible_facts' from source: unknown
40074 1727204608.58160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204608.58214: attempt loop complete, returning result
40074 1727204608.58218: _execute() done
40074 1727204608.58221: dumping result to json
40074 1727204608.58237: done dumping result, returning
40074 1727204608.58246: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-9fd7-2501-0000000000d0]
40074 1727204608.58250: sending task result for task 12b410aa-8751-9fd7-2501-0000000000d0
40074 1727204608.58404: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000d0
40074 1727204608.58407: WORKER PROCESS EXITING
ok: [managed-node2]
40074 1727204608.58558: no more pending results, returning what we have
40074 1727204608.58561: results queue empty
40074 1727204608.58562: checking for any_errors_fatal
40074 1727204608.58563: done checking for any_errors_fatal
40074 1727204608.58564: checking for max_fail_percentage
40074 1727204608.58566: done checking for max_fail_percentage
40074 1727204608.58567: checking to see if all hosts have failed and the running result is not ok
40074 1727204608.58568: done checking to see if all hosts have failed
40074 1727204608.58569: getting the remaining hosts for this loop
40074 1727204608.58570: done getting the remaining hosts for this loop
40074 1727204608.58574: getting the next task for host managed-node2
40074 1727204608.58582: done getting next task for host managed-node2
40074 1727204608.58585: ^ task is: TASK: Check if system is ostree
40074 1727204608.58587: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204608.58600: getting variables
40074 1727204608.58602: in VariableManager get_vars()
40074 1727204608.58629: Calling all_inventory to load vars for managed-node2
40074 1727204608.58632: Calling groups_inventory to load vars for managed-node2
40074 1727204608.58635: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204608.58646: Calling all_plugins_play to load vars for managed-node2
40074 1727204608.58649: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204608.58653: Calling groups_plugins_play to load vars for managed-node2
40074 1727204608.58835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204608.59028: done with get_vars()
40074 1727204608.59041: done getting variables

TASK [Check if system is ostree] ***********************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Tuesday 24 September 2024  15:03:28 -0400 (0:00:00.825)       0:00:02.352 *****
40074 1727204608.59117: entering _queue_task() for managed-node2/stat
40074 1727204608.59337: worker is 1 (out of 1 available)
40074 1727204608.59352: exiting _queue_task() for managed-node2/stat
40074 1727204608.59364: done queuing things up, now waiting for results queue to drain
40074 1727204608.59365: waiting for pending results...
40074 1727204608.59612: running TaskExecutor() for managed-node2/TASK: Check if system is ostree
40074 1727204608.59617: in run() - task 12b410aa-8751-9fd7-2501-0000000000d2
40074 1727204608.59620: variable 'ansible_search_path' from source: unknown
40074 1727204608.59622: variable 'ansible_search_path' from source: unknown
40074 1727204608.59634: calling self._execute()
40074 1727204608.59698: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204608.59704: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204608.59716: variable 'omit' from source: magic vars
40074 1727204608.60120: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204608.60335: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204608.60372: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204608.60423: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204608.60453: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204608.60528: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204608.60551: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204608.60572: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204608.60599: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204608.60703: Evaluated conditional (not __network_is_ostree is defined): True
40074 1727204608.60707: variable 'omit' from source: magic vars
40074 1727204608.60741: variable 'omit' from source: magic vars
40074 1727204608.60771: variable 'omit' from source: magic vars
40074 1727204608.60793: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204608.60825: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204608.60838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204608.60855: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204608.60864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204608.60894: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204608.60898: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204608.60901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204608.60988: Set connection var ansible_pipelining to False
40074 1727204608.60996: Set connection var ansible_shell_executable to /bin/sh
40074 1727204608.61000: Set connection var ansible_shell_type to sh
40074 1727204608.61002: Set connection var ansible_connection to ssh
40074 1727204608.61010: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204608.61018: Set connection var ansible_timeout to 10
40074 1727204608.61041: variable 'ansible_shell_executable' from source: unknown
40074 1727204608.61044: variable 'ansible_connection' from
source: unknown 40074 1727204608.61047: variable 'ansible_module_compression' from source: unknown 40074 1727204608.61052: variable 'ansible_shell_type' from source: unknown 40074 1727204608.61054: variable 'ansible_shell_executable' from source: unknown 40074 1727204608.61060: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204608.61064: variable 'ansible_pipelining' from source: unknown 40074 1727204608.61069: variable 'ansible_timeout' from source: unknown 40074 1727204608.61074: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204608.61197: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204608.61206: variable 'omit' from source: magic vars 40074 1727204608.61212: starting attempt loop 40074 1727204608.61215: running the handler 40074 1727204608.61228: _low_level_execute_command(): starting 40074 1727204608.61236: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204608.61775: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204608.61779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204608.61782: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204608.61785: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204608.61787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204608.61846: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204608.61852: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204608.61900: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 40074 1727204608.64442: stdout chunk (state=3): >>>/root <<< 40074 1727204608.64485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204608.64534: stderr chunk (state=3): >>><<< 40074 1727204608.64538: stdout chunk (state=3): >>><<< 40074 1727204608.64564: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 40074 1727204608.64575: _low_level_execute_command(): starting 40074 1727204608.64581: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111 `" && echo ansible-tmp-1727204608.6456194-40190-183069344624111="` echo /root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111 `" ) && sleep 0' 40074 1727204608.65033: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204608.65037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204608.65039: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204608.65042: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204608.65096: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204608.65099: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 40074 1727204608.65153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 40074 1727204608.68102: stdout chunk (state=3): >>>ansible-tmp-1727204608.6456194-40190-183069344624111=/root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111 <<< 40074 1727204608.68310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204608.68365: stderr chunk (state=3): >>><<< 40074 1727204608.68369: stdout chunk (state=3): >>><<< 40074 1727204608.68389: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204608.6456194-40190-183069344624111=/root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 40074 1727204608.68438: variable 'ansible_module_compression' from source: unknown 40074 1727204608.68488: 
ANSIBALLZ: Using lock for stat 40074 1727204608.68494: ANSIBALLZ: Acquiring lock 40074 1727204608.68497: ANSIBALLZ: Lock acquired: 139809964201536 40074 1727204608.68499: ANSIBALLZ: Creating module 40074 1727204608.89273: ANSIBALLZ: Writing module into payload 40074 1727204608.89498: ANSIBALLZ: Writing module 40074 1727204608.89502: ANSIBALLZ: Renaming module 40074 1727204608.89505: ANSIBALLZ: Done creating module 40074 1727204608.89507: variable 'ansible_facts' from source: unknown 40074 1727204608.89509: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/AnsiballZ_stat.py 40074 1727204608.89766: Sending initial data 40074 1727204608.89770: Sent initial data (153 bytes) 40074 1727204608.91212: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204608.91248: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204608.91321: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204608.91384: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 40074 1727204608.93224: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 40074 1727204608.93243: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 40074 1727204608.93268: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204608.93327: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204608.93439: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpn7roq4ut /root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/AnsiballZ_stat.py <<< 40074 1727204608.93456: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/AnsiballZ_stat.py" <<< 40074 1727204608.93653: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpn7roq4ut" to remote "/root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/AnsiballZ_stat.py" <<< 40074 1727204608.95711: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204608.96072: stderr chunk (state=3): >>><<< 40074 1727204608.96081: stdout chunk (state=3): >>><<< 40074 1727204608.96084: done transferring module to remote 40074 1727204608.96087: _low_level_execute_command(): starting 40074 1727204608.96091: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/ /root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/AnsiballZ_stat.py && sleep 0' 40074 1727204608.97313: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204608.97329: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204608.97345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204608.97523: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204608.97538: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204608.97807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204608.99697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204608.99776: stderr chunk (state=3): >>><<< 40074 1727204608.99786: stdout chunk (state=3): >>><<< 40074 1727204608.99814: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204608.99823: _low_level_execute_command(): starting 40074 1727204608.99836: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/AnsiballZ_stat.py && sleep 0' 40074 1727204609.01001: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204609.01310: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204609.01507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204609.01681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204609.04024: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 40074 1727204609.04069: stdout chunk (state=3): 
>>>import _imp # builtin <<< 40074 1727204609.04108: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # import '_weakref' # <<< 40074 1727204609.04176: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 40074 1727204609.04290: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook <<< 40074 1727204609.04336: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 40074 1727204609.04394: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204609.04403: stdout chunk (state=3): >>>import '_codecs' # <<< 40074 1727204609.04435: stdout chunk (state=3): >>>import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 40074 1727204609.04462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 40074 1727204609.04608: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31544ac4d0> <<< 40074 1727204609.04612: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315447bad0> <<< 40074 1727204609.04639: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31544aea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # <<< 40074 1727204609.04722: stdout chunk (state=3): >>>import 
'_collections_abc' # <<< 40074 1727204609.04752: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 40074 1727204609.04786: stdout chunk (state=3): >>>import 'os' # <<< 40074 1727204609.04803: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 40074 1727204609.04815: stdout chunk (state=3): >>>Processing user site-packages <<< 40074 1727204609.04837: stdout chunk (state=3): >>>Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' <<< 40074 1727204609.04862: stdout chunk (state=3): >>>Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 40074 1727204609.04893: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py <<< 40074 1727204609.05094: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 40074 1727204609.05098: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31544bd0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 40074 1727204609.05101: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204609.05103: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31544bdfd0> <<< 40074 1727204609.05106: stdout chunk (state=3): >>>import 'site' # <<< 40074 1727204609.05174: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 40074 1727204609.05316: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 40074 1727204609.05400: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 40074 1727204609.05434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 40074 1727204609.05451: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 40074 1727204609.05481: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 40074 1727204609.05497: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315429be90> <<< 40074 1727204609.05534: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 40074 1727204609.05603: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315429bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 40074 1727204609.05631: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 40074 1727204609.05645: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 40074 1727204609.05697: stdout 
chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204609.05770: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542d3860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 40074 1727204609.05994: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542d3ef0> import '_collections' # <<< 40074 1727204609.05998: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542b3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542b1280> <<< 40074 1727204609.06235: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154299040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542f7740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542f6360> 
# /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542b2270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315429af30> <<< 40074 1727204609.06291: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 40074 1727204609.06305: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154328740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542982c0> <<< 40074 1727204609.06435: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 40074 1727204609.06461: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154328bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154328aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154328e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154296de0> <<< 40074 
1727204609.06469: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 40074 1727204609.06553: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154329520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31543291f0> <<< 40074 1727204609.06575: stdout chunk (state=3): >>>import 'importlib.machinery' # <<< 40074 1727204609.06603: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 40074 1727204609.06620: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315432a420> <<< 40074 1727204609.06635: stdout chunk (state=3): >>>import 'importlib.util' # <<< 40074 1727204609.06656: stdout chunk (state=3): >>>import 'runpy' # <<< 40074 1727204609.06681: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 40074 1727204609.06814: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154344650> import 'errno' # # extension module 'zlib' loaded from 
'/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154345d60> <<< 40074 1727204609.06824: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 40074 1727204609.07095: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 40074 1727204609.07099: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154346c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31543472c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31543461b0> <<< 40074 1727204609.07102: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 40074 1727204609.07104: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 40074 1727204609.07220: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154347d40> import 'lzma' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3154347470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315432a480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31540cbcb0> <<< 40074 1727204609.07226: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 40074 1727204609.07250: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 40074 1727204609.07269: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31540f47a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540f4500> <<< 40074 1727204609.07302: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31540f47d0> <<< 40074 
1727204609.07364: stdout chunk (state=3): >>># extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31540f49b0> <<< 40074 1727204609.07380: stdout chunk (state=3): >>>import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540c9e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 40074 1727204609.07488: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 40074 1727204609.07535: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 40074 1727204609.07557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540f5fa0> <<< 40074 1727204609.07644: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540f4c20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315432ab70> <<< 40074 1727204609.07666: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204609.07695: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 40074 1727204609.07736: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 40074 1727204609.07767: stdout chunk (state=3): >>>import 
'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154122360> <<< 40074 1727204609.07820: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 40074 1727204609.07842: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204609.07896: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 40074 1727204609.07918: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 40074 1727204609.07961: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315413a4b0> <<< 40074 1727204609.08021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 40074 1727204609.08025: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 40074 1727204609.08080: stdout chunk (state=3): >>>import 'ntpath' # <<< 40074 1727204609.08101: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154177290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 40074 1727204609.08179: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 40074 1727204609.08204: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code 
object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 40074 1727204609.08311: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154199a30> <<< 40074 1727204609.08383: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31541773b0> <<< 40074 1727204609.08514: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315413b140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py <<< 40074 1727204609.08518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153fb43b0> <<< 40074 1727204609.08529: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31541394f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540f6f00> <<< 40074 1727204609.08592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 40074 1727204609.08625: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3153fb4650> <<< 40074 1727204609.08693: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_gxn02oc7/ansible_stat_payload.zip' <<< 40074 1727204609.08767: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.08899: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 40074 1727204609.08959: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 40074 1727204609.09065: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315400a0f0> import '_typing' # <<< 40074 1727204609.09274: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153fe0fe0> <<< 40074 1727204609.09285: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153fe0170> # zipimport: zlib available <<< 40074 1727204609.09403: stdout chunk (state=3): >>>import 'ansible' # <<< 40074 1727204609.09407: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.09425: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 40074 1727204609.10978: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.12279: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py <<< 40074 1727204609.12355: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153fe3f50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 40074 1727204609.12382: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches 
/usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 40074 1727204609.12453: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 40074 1727204609.12470: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154035a90> <<< 40074 1727204609.12494: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154035820> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154035160> <<< 40074 1727204609.12516: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' <<< 40074 1727204609.12619: stdout chunk (state=3): >>>import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540358b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315400ab10> import 'atexit' # <<< 40074 1727204609.12623: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204609.12830: stdout chunk (state=3): >>># extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154036840> <<< 40074 1727204609.12833: stdout chunk (state=3): >>># extension module 'fcntl' 
loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154036a80> <<< 40074 1727204609.12836: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154036fc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 40074 1727204609.12874: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e98ce0> <<< 40074 1727204609.12900: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204609.12914: stdout chunk (state=3): >>># extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153e9a990> <<< 40074 1727204609.12943: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 40074 1727204609.12982: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9b350> <<< 40074 1727204609.13003: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 40074 1727204609.13062: stdout chunk 
(state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 40074 1727204609.13065: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9c530> <<< 40074 1727204609.13076: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py <<< 40074 1727204609.13114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 40074 1727204609.13153: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 40074 1727204609.13249: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9efc0> <<< 40074 1727204609.13276: stdout chunk (state=3): >>># extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153e9f0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9d280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 40074 1727204609.13314: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 40074 1727204609.13355: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 40074 1727204609.13466: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 40074 1727204609.13497: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ea2ea0> import '_tokenize' # <<< 40074 1727204609.13520: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ea1970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ea16d0> <<< 40074 1727204609.13535: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 40074 1727204609.13556: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 40074 1727204609.13626: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ea3b60> <<< 40074 1727204609.13661: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9d790> <<< 40074 1727204609.13806: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153eeb0b0> <<< 40074 1727204609.13824: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153eeb230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204609.13846: stdout chunk (state=3): >>># extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef0e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef0bc0> <<< 40074 1727204609.13863: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 40074 1727204609.13991: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 40074 1727204609.14046: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef3380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef14f0> <<< 40074 1727204609.14074: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/logging/__init__.py <<< 40074 1727204609.14131: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204609.14161: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 40074 1727204609.14174: stdout chunk (state=3): >>>import '_string' # <<< 40074 1727204609.14239: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef6ba0> <<< 40074 1727204609.14382: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef3530> <<< 40074 1727204609.14620: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef7e60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef7860> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef7f20> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3153eeb530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 40074 1727204609.14653: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 40074 1727204609.14682: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204609.14724: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153efb5f0> <<< 40074 1727204609.15024: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204609.15060: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153efc620> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef9d60> <<< 40074 1727204609.15202: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153efb0e0> import 'systemd.daemon' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef9940> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 40074 1727204609.15309: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.15472: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.15518: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 40074 1727204609.15545: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 40074 1727204609.15774: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.16002: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.17181: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.18483: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 40074 1727204609.18592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153f84830> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3153f85520> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153efc6b0> <<< 40074 1727204609.18626: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 40074 1727204609.18698: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.18984: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f854f0> <<< 40074 1727204609.19039: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.19606: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.20113: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.20418: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 40074 1727204609.20473: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.warnings' # <<< 40074 1727204609.20508: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.20622: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.20816: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 40074 1727204609.20858: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.20871: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 40074 1727204609.20916: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.20945: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.20997: stdout chunk 
(state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 40074 1727204609.21057: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.21515: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.22010: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 40074 1727204609.22114: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 40074 1727204609.22141: stdout chunk (state=3): >>>import '_ast' # <<< 40074 1727204609.22299: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f86300> <<< 40074 1727204609.22389: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.22443: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.22595: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 40074 1727204609.22644: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 40074 1727204609.22659: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 40074 1727204609.22900: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 40074 1727204609.22946: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153f8e0f0> <<< 40074 1727204609.23039: stdout chunk (state=3): >>># extension module '_blake2' loaded from 
'/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153f8ea50> <<< 40074 1727204609.23044: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f872c0> <<< 40074 1727204609.23061: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.23116: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.23220: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 40074 1727204609.23529: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.23572: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 40074 1727204609.23620: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153f8d7f0> <<< 40074 1727204609.23663: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f8ec60> <<< 40074 1727204609.23718: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 40074 1727204609.23784: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 40074 1727204609.23867: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.23884: stdout chunk (state=3): >>># zipimport: zlib available <<< 40074 1727204609.23933: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 40074 1727204609.24075: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 40074 1727204609.24145: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 40074 1727204609.24166: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 40074 1727204609.24192: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e22e10> <<< 40074 1727204609.24236: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153d9cb90> <<< 40074 1727204609.24333: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153d9bd70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153d9aab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 40074 1727204609.24372: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 40074 1727204609.24462: stdout chunk 
(state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 40074 1727204609.24546: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available <<< 40074 1727204609.24587: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 40074 1727204609.24934: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 40074 1727204609.25414: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 40074 1727204609.25662: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 40074 1727204609.25701: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc <<< 40074 1727204609.25764: stdout chunk (state=3): >>># cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing 
__main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil <<< 40074 1727204609.25782: stdout chunk (state=3): >>># cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] 
removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ <<< 40074 1727204609.25861: stdout chunk (state=3): >>># cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # 
cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast <<< 40074 1727204609.25865: stdout chunk (state=3): >>># destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing 
ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 40074 1727204609.26221: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 40074 1727204609.26224: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 40074 1727204609.26253: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma <<< 40074 1727204609.26281: stdout chunk (state=3): >>># destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 40074 
1727204609.26310: stdout chunk (state=3): >>># destroy ntpath <<< 40074 1727204609.26347: stdout chunk (state=3): >>># destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 40074 1727204609.26398: stdout chunk (state=3): >>># destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 40074 1727204609.26427: stdout chunk (state=3): >>># destroy selectors # destroy errno # destroy array # destroy datetime <<< 40074 1727204609.26479: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess <<< 40074 1727204609.26573: stdout chunk (state=3): >>># cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 40074 1727204609.26576: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform <<< 40074 1727204609.26667: stdout chunk (state=3): >>># cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # 
cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc <<< 40074 1727204609.26671: stdout chunk (state=3): >>># destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os <<< 40074 1727204609.26733: stdout chunk (state=3): >>># destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time <<< 40074 1727204609.26737: stdout chunk (state=3): >>># cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins <<< 40074 1727204609.26824: stdout chunk (state=3): >>># destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 40074 1727204609.26925: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 40074 
1727204609.26961: stdout chunk (state=3): >>># destroy _collections <<< 40074 1727204609.27006: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser <<< 40074 1727204609.27009: stdout chunk (state=3): >>># destroy tokenize <<< 40074 1727204609.27037: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 40074 1727204609.27103: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 40074 1727204609.27259: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 40074 1727204609.27294: stdout chunk (state=3): >>># destroy _random # destroy _weakref <<< 40074 1727204609.27334: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 40074 1727204609.27384: stdout chunk (state=3): >>># destroy itertools <<< 40074 1727204609.27387: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins <<< 40074 1727204609.27406: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 40074 1727204609.28048: stderr 
chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204609.28051: stdout chunk (state=3): >>><<< 40074 1727204609.28054: stderr chunk (state=3): >>><<< 40074 1727204609.28263: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31544ac4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315447bad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31544aea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth 
file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31544bd0a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31544bdfd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315429be90> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f315429bf50> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542d3860> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542d3ef0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542b3b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542b1280> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154299040> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f31542f7740> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542f6360> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542b2270> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315429af30> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154328740> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31542982c0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154328bf0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154328aa0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154328e30> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154296de0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154329520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31543291f0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315432a420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154344650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154345d60> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from 
'/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154346c60> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31543472c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31543461b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154347d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154347470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315432a480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31540cbcb0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py 
# code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31540f47a0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540f4500> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31540f47d0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f31540f49b0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540c9e50> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540f5fa0> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540f4c20> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315432ab70> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py 
# code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154122360> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315413a4b0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154177290> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154199a30> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31541773b0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f315413b140> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153fb43b0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31541394f0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540f6f00> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f3153fb4650> # zipimport: found 30 names in '/tmp/ansible_stat_payload_gxn02oc7/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315400a0f0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153fe0fe0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153fe0170> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches 
/usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153fe3f50> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154035a90> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154035820> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154035160> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f31540358b0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f315400ab10> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 
0x7f3154036840> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3154036a80> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3154036fc0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e98ce0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153e9a990> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9b350> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9c530> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9efc0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153e9f0e0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9d280> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ea2ea0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ea1970> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ea16d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f3153ea3b60> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e9d790> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153eeb0b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153eeb230> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef0e00> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef0bc0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from 
'/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef3380> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef14f0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef6ba0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef3530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef7e60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef7860> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153ef7f20> import 'systemd.journal' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f3153eeb530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153efb5f0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153efc620> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef9d60> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153efb0e0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153ef9940> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available 
import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153f84830> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f85520> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153efc6b0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f854f0> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f86300> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153f8e0f0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153f8ea50> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f872c0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib 
available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f3153f8d7f0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153f8ec60> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153e22e10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153d9cb90> import 'distro.distro' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7f3153d9bd70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f3153d9aab0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # 
cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing 
contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # 
cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy 
ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # 
destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # 
cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # 
destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
[WARNING]: Module invocation had junk after the JSON data: # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] 
removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] 
removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes 
# destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing 
ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy 
linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # 
destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks
40074 1727204609.28981: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
40074 1727204609.28985: _low_level_execute_command(): starting
40074 1727204609.28988: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204608.6456194-40190-183069344624111/ > /dev/null 2>&1 && sleep 0'
40074 1727204609.29394: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204609.29409: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204609.29510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204609.29527: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204609.29640: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204609.32578: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204609.32582: stdout chunk (state=3): >>><<<
40074 1727204609.32585: stderr chunk (state=3): >>><<<
40074 1727204609.32610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204609.32647: handler run complete
40074 1727204609.32669: attempt loop complete, returning result
40074 1727204609.32686: _execute() done
40074 1727204609.32753: dumping result to json
40074 1727204609.32757: done dumping result, returning
40074 1727204609.32759: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [12b410aa-8751-9fd7-2501-0000000000d2]
40074 1727204609.32762: sending task result for task 12b410aa-8751-9fd7-2501-0000000000d2
ok: [managed-node2] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
40074 1727204609.32952: no more pending results, returning what we have
40074 1727204609.32956: results queue empty
40074 1727204609.32958: checking for any_errors_fatal
40074 1727204609.32966: done checking for any_errors_fatal
40074 1727204609.32967: checking for max_fail_percentage
40074 1727204609.32969: done checking for max_fail_percentage
40074 1727204609.32970: checking to see if all hosts have failed and the running result is not ok
40074 1727204609.32972: done checking to see if all hosts have failed
40074 1727204609.32973: getting the remaining hosts for this loop
40074 1727204609.32974: done getting the remaining hosts for this loop
40074 1727204609.32979: getting the next task for host managed-node2
40074 1727204609.32989: done getting next task for host managed-node2
40074 1727204609.33202: ^ task is: TASK: Set flag to indicate system is ostree
40074 1727204609.33206: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204609.33211: getting variables
40074 1727204609.33213: in VariableManager get_vars()
40074 1727204609.33248: Calling all_inventory to load vars for managed-node2
40074 1727204609.33252: Calling groups_inventory to load vars for managed-node2
40074 1727204609.33256: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204609.33270: Calling all_plugins_play to load vars for managed-node2
40074 1727204609.33273: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204609.33278: Calling groups_plugins_play to load vars for managed-node2
40074 1727204609.33837: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000d2
40074 1727204609.33840: WORKER PROCESS EXITING
40074 1727204609.33869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204609.34531: done with get_vars()
40074 1727204609.34544: done getting variables
40074 1727204609.34655: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Set flag to indicate system is ostree] ***********************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22
Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.755) 0:00:03.108 *****
40074 1727204609.34687: entering _queue_task() for managed-node2/set_fact
40074 1727204609.34692: Creating lock for set_fact
40074 1727204609.34974: worker is 1 (out of 1 available)
40074 1727204609.34987: exiting _queue_task() for managed-node2/set_fact
40074 1727204609.35000: done queuing things up, now waiting for results queue to drain
40074 1727204609.35001: waiting for pending results...
40074 1727204609.35270: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree
40074 1727204609.35365: in run() - task 12b410aa-8751-9fd7-2501-0000000000d3
40074 1727204609.35380: variable 'ansible_search_path' from source: unknown
40074 1727204609.35384: variable 'ansible_search_path' from source: unknown
40074 1727204609.35425: calling self._execute()
40074 1727204609.35510: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204609.35518: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204609.35530: variable 'omit' from source: magic vars
40074 1727204609.36150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204609.36436: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204609.36493: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204609.36537: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204609.36571: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204609.36668: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204609.36701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204609.36735: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204609.36765: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204609.36901: Evaluated conditional (not __network_is_ostree is defined): True
40074 1727204609.36904: variable 'omit' from source: magic vars
40074 1727204609.36954: variable 'omit' from source: magic vars
40074 1727204609.37100: variable '__ostree_booted_stat' from source: set_fact
40074 1727204609.37158: variable 'omit' from source: magic vars
40074 1727204609.37186: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204609.37228: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204609.37294: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204609.37298: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204609.37300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204609.37306: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204609.37311: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204609.37316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204609.37447: Set connection var ansible_pipelining to False
40074 1727204609.37451: Set connection var ansible_shell_executable to /bin/sh
40074 1727204609.37455: Set connection var ansible_shell_type to sh
40074 1727204609.37463: Set connection var ansible_connection to ssh
40074 1727204609.37472: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204609.37480: Set connection var ansible_timeout to 10
40074 1727204609.37511: variable 'ansible_shell_executable' from source: unknown
40074 1727204609.37514: variable 'ansible_connection' from source: unknown
40074 1727204609.37517: variable 'ansible_module_compression' from source: unknown
40074 1727204609.37556: variable 'ansible_shell_type' from source: unknown
40074 1727204609.37559: variable 'ansible_shell_executable' from source: unknown
40074 1727204609.37561: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204609.37565: variable 'ansible_pipelining' from source: unknown
40074 1727204609.37571: variable 'ansible_timeout' from source: unknown
40074 1727204609.37573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204609.37666: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204609.37675: variable 'omit' from source: magic vars
40074 1727204609.37688: starting attempt loop
40074 1727204609.37692: running the handler
40074 1727204609.37769: handler run complete
40074 1727204609.37773: attempt loop complete, returning result
40074 1727204609.37776: _execute() done
40074 1727204609.37778: dumping result to json
40074 1727204609.37780: done dumping result, returning
40074 1727204609.37782: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [12b410aa-8751-9fd7-2501-0000000000d3]
40074 1727204609.37784: sending task result for task 12b410aa-8751-9fd7-2501-0000000000d3
40074 1727204609.37854: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000d3
40074 1727204609.37858: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
40074 1727204609.37919: no more pending results, returning what we have
40074 1727204609.37922: results queue empty
40074 1727204609.37924: checking for any_errors_fatal
40074 1727204609.37931: done checking for any_errors_fatal
40074 1727204609.37932: checking for max_fail_percentage
40074 1727204609.37934: done checking for max_fail_percentage
40074 1727204609.37935: checking to see if all hosts have failed and the running result is not ok
40074 1727204609.37936: done checking to see if all hosts have failed
40074 1727204609.37937: getting the remaining hosts for this loop
40074 1727204609.37939: done getting the remaining hosts for this loop
40074 1727204609.37943: getting the next task for host managed-node2
40074 1727204609.37953: done getting next task for host managed-node2
40074 1727204609.37956: ^ task is: TASK: Fix CentOS6 Base repo
40074 1727204609.37959: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204609.37964: getting variables
40074 1727204609.37966: in VariableManager get_vars()
40074 1727204609.37997: Calling all_inventory to load vars for managed-node2
40074 1727204609.38000: Calling groups_inventory to load vars for managed-node2
40074 1727204609.38004: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204609.38018: Calling all_plugins_play to load vars for managed-node2
40074 1727204609.38021: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204609.38032: Calling groups_plugins_play to load vars for managed-node2
40074 1727204609.38494: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204609.38844: done with get_vars()
40074 1727204609.38856: done getting variables
40074 1727204609.38994: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.043) 0:00:03.152 *****
40074 1727204609.39025: entering _queue_task() for managed-node2/copy
40074 1727204609.39315: worker is 1 (out of 1 available)
40074 1727204609.39329: exiting _queue_task() for managed-node2/copy
40074 1727204609.39340: done queuing things up, now waiting for results queue to drain
40074 1727204609.39342: waiting for pending results...
40074 1727204609.39713: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo
40074 1727204609.39723: in run() - task 12b410aa-8751-9fd7-2501-0000000000d5
40074 1727204609.39728: variable 'ansible_search_path' from source: unknown
40074 1727204609.39734: variable 'ansible_search_path' from source: unknown
40074 1727204609.39767: calling self._execute()
40074 1727204609.39860: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204609.39867: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204609.39880: variable 'omit' from source: magic vars
40074 1727204609.40437: variable 'ansible_distribution' from source: facts
40074 1727204609.40571: Evaluated conditional (ansible_distribution == 'CentOS'): False
40074 1727204609.40575: when evaluation is False, skipping this task
40074 1727204609.40577: _execute() done
40074 1727204609.40579: dumping result to json
40074 1727204609.40581: done dumping result, returning
40074 1727204609.40583: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [12b410aa-8751-9fd7-2501-0000000000d5]
40074 1727204609.40585: sending task result for task 12b410aa-8751-9fd7-2501-0000000000d5
40074 1727204609.40659: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000d5
40074 1727204609.40663: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution == 'CentOS'",
    "skip_reason": "Conditional result was False"
}
40074 1727204609.40736: no more pending results, returning what we have
40074 1727204609.40740: results queue empty
40074 1727204609.40741: checking for any_errors_fatal
40074 1727204609.40746: done checking for any_errors_fatal
40074 1727204609.40747: checking for max_fail_percentage
40074 1727204609.40748: done checking for max_fail_percentage
40074 1727204609.40749: checking to see if all hosts have failed and the running result is not ok
40074 1727204609.40750: done checking to see if all hosts have failed
40074 1727204609.40751: getting the remaining hosts for this loop
40074 1727204609.40753: done getting the remaining hosts for this loop
40074 1727204609.40757: getting the next task for host managed-node2
40074 1727204609.40765: done getting next task for host managed-node2
40074 1727204609.40769: ^ task is: TASK: Include the task 'enable_epel.yml'
40074 1727204609.40772: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204609.40776: getting variables
40074 1727204609.40778: in VariableManager get_vars()
40074 1727204609.40809: Calling all_inventory to load vars for managed-node2
40074 1727204609.40813: Calling groups_inventory to load vars for managed-node2
40074 1727204609.40817: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204609.40830: Calling all_plugins_play to load vars for managed-node2
40074 1727204609.40834: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204609.40838: Calling groups_plugins_play to load vars for managed-node2
40074 1727204609.41234: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204609.41525: done with get_vars()
40074 1727204609.41538: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.026) 0:00:03.178 *****
40074 1727204609.41681: entering _queue_task() for managed-node2/include_tasks
40074 1727204609.42001: worker is 1 (out of 1 available)
40074 1727204609.42017: exiting _queue_task() for managed-node2/include_tasks
40074 1727204609.42034: done queuing things up, now waiting for results queue to drain
40074 1727204609.42035: waiting for pending results...
40074 1727204609.42530: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml'
40074 1727204609.42537: in run() - task 12b410aa-8751-9fd7-2501-0000000000d6
40074 1727204609.42541: variable 'ansible_search_path' from source: unknown
40074 1727204609.42544: variable 'ansible_search_path' from source: unknown
40074 1727204609.42546: calling self._execute()
40074 1727204609.42548: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204609.42551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204609.42554: variable 'omit' from source: magic vars
40074 1727204609.43085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204609.45514: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204609.45591: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204609.45637: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204609.45677: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204609.45708: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204609.45796: 
Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204609.45834: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204609.45862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204609.45919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204609.45933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204609.46194: variable '__network_is_ostree' from source: set_fact 40074 1727204609.46197: Evaluated conditional (not __network_is_ostree | d(false)): True 40074 1727204609.46199: _execute() done 40074 1727204609.46201: dumping result to json 40074 1727204609.46203: done dumping result, returning 40074 1727204609.46205: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-9fd7-2501-0000000000d6] 40074 1727204609.46207: sending task result for task 12b410aa-8751-9fd7-2501-0000000000d6 40074 1727204609.46276: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000d6 40074 1727204609.46278: WORKER PROCESS EXITING 40074 1727204609.46308: no more pending results, returning what we have 40074 1727204609.46312: in VariableManager get_vars() 40074 1727204609.46345: Calling all_inventory to load vars for managed-node2 40074 
1727204609.46348: Calling groups_inventory to load vars for managed-node2 40074 1727204609.46351: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.46363: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.46366: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.46370: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.46664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.47044: done with get_vars() 40074 1727204609.47059: variable 'ansible_search_path' from source: unknown 40074 1727204609.47060: variable 'ansible_search_path' from source: unknown 40074 1727204609.47108: we have included files to process 40074 1727204609.47110: generating all_blocks data 40074 1727204609.47112: done generating all_blocks data 40074 1727204609.47118: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 40074 1727204609.47120: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 40074 1727204609.47123: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 40074 1727204609.48186: done processing included file 40074 1727204609.48188: iterating over new_blocks loaded from include file 40074 1727204609.48193: in VariableManager get_vars() 40074 1727204609.48208: done with get_vars() 40074 1727204609.48210: filtering new block on tags 40074 1727204609.48246: done filtering new block on tags 40074 1727204609.48256: in VariableManager get_vars() 40074 1727204609.48270: done with get_vars() 40074 1727204609.48272: filtering new block on tags 40074 1727204609.48288: done filtering new block on tags 40074 1727204609.48293: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 40074 1727204609.48299: extending task lists for all hosts with included blocks 40074 1727204609.48461: done extending task lists 40074 1727204609.48463: done processing included files 40074 1727204609.48464: results queue empty 40074 1727204609.48465: checking for any_errors_fatal 40074 1727204609.48476: done checking for any_errors_fatal 40074 1727204609.48477: checking for max_fail_percentage 40074 1727204609.48478: done checking for max_fail_percentage 40074 1727204609.48479: checking to see if all hosts have failed and the running result is not ok 40074 1727204609.48480: done checking to see if all hosts have failed 40074 1727204609.48481: getting the remaining hosts for this loop 40074 1727204609.48483: done getting the remaining hosts for this loop 40074 1727204609.48486: getting the next task for host managed-node2 40074 1727204609.48494: done getting next task for host managed-node2 40074 1727204609.48496: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 40074 1727204609.48500: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.48502: getting variables 40074 1727204609.48504: in VariableManager get_vars() 40074 1727204609.48513: Calling all_inventory to load vars for managed-node2 40074 1727204609.48516: Calling groups_inventory to load vars for managed-node2 40074 1727204609.48519: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.48526: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.48537: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.48542: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.48966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.49325: done with get_vars() 40074 1727204609.49337: done getting variables 40074 1727204609.49423: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 40074 1727204609.49686: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.080) 0:00:03.259 ***** 40074 1727204609.49751: entering _queue_task() for managed-node2/command 40074 1727204609.49753: Creating lock for command 40074 1727204609.50119: worker is 1 (out of 1 available) 40074 1727204609.50134: exiting _queue_task() for managed-node2/command 40074 1727204609.50147: done queuing things up, now waiting for results queue to drain 40074 1727204609.50148: waiting for pending results... 
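The include logged above follows the usual conditional `include_tasks` pattern: enable_epel.yml is pulled in only because `not __network_is_ostree | d(false)` evaluated to True. A minimal sketch of such a task (hypothetical reconstruction; only the task name, included file, and condition are taken from the log, and the actual wording in el_repo_setup.yml may differ):

```yaml
# Hypothetical sketch -- illustrates the pattern visible in the log,
# not the verbatim contents of el_repo_setup.yml.
- name: Include the task 'enable_epel.yml'
  include_tasks: tasks/enable_epel.yml
  # Skip EPEL setup on ostree-based systems; d(false) supplies a default
  # when the __network_is_ostree fact was never set.
  when: not __network_is_ostree | d(false)
```

After the include, the log shows the controller loading the new file, filtering its blocks on tags, and extending the per-host task list, which is why the EPEL tasks appear next in the run.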
40074 1727204609.50517: running TaskExecutor() for managed-node2/TASK: Create EPEL 39 40074 1727204609.50562: in run() - task 12b410aa-8751-9fd7-2501-0000000000f0 40074 1727204609.50581: variable 'ansible_search_path' from source: unknown 40074 1727204609.50591: variable 'ansible_search_path' from source: unknown 40074 1727204609.50648: calling self._execute() 40074 1727204609.50750: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.50765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.50781: variable 'omit' from source: magic vars 40074 1727204609.51270: variable 'ansible_distribution' from source: facts 40074 1727204609.51292: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 40074 1727204609.51379: when evaluation is False, skipping this task 40074 1727204609.51387: _execute() done 40074 1727204609.51390: dumping result to json 40074 1727204609.51393: done dumping result, returning 40074 1727204609.51397: done running TaskExecutor() for managed-node2/TASK: Create EPEL 39 [12b410aa-8751-9fd7-2501-0000000000f0] 40074 1727204609.51399: sending task result for task 12b410aa-8751-9fd7-2501-0000000000f0 40074 1727204609.51478: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000f0 40074 1727204609.51484: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 40074 1727204609.51545: no more pending results, returning what we have 40074 1727204609.51549: results queue empty 40074 1727204609.51550: checking for any_errors_fatal 40074 1727204609.51551: done checking for any_errors_fatal 40074 1727204609.51552: checking for max_fail_percentage 40074 1727204609.51554: done checking for max_fail_percentage 40074 1727204609.51555: checking to see if all hosts have failed and the running result is not ok 40074 
1727204609.51556: done checking to see if all hosts have failed 40074 1727204609.51557: getting the remaining hosts for this loop 40074 1727204609.51559: done getting the remaining hosts for this loop 40074 1727204609.51564: getting the next task for host managed-node2 40074 1727204609.51571: done getting next task for host managed-node2 40074 1727204609.51575: ^ task is: TASK: Install yum-utils package 40074 1727204609.51579: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.51583: getting variables 40074 1727204609.51585: in VariableManager get_vars() 40074 1727204609.51617: Calling all_inventory to load vars for managed-node2 40074 1727204609.51620: Calling groups_inventory to load vars for managed-node2 40074 1727204609.51624: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.51641: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.51645: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.51649: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.52188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.52607: done with get_vars() 40074 1727204609.52618: done getting variables 40074 1727204609.52738: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.030) 0:00:03.289 ***** 40074 1727204609.52771: entering _queue_task() for managed-node2/package 40074 1727204609.52774: Creating lock for package 40074 1727204609.53065: worker is 1 (out of 1 available) 40074 1727204609.53079: exiting _queue_task() for managed-node2/package 40074 1727204609.53093: done queuing things up, now waiting for results queue to drain 40074 1727204609.53094: waiting for pending results... 
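The banner `TASK [Create EPEL 39]` shows per-host templating of task names: the raw name in enable_epel.yml is `Create EPEL {{ ansible_distribution_major_version }}`, and the `ansible_distribution_major_version` fact (39 on this host) is substituted before the banner is printed. A hedged sketch of the shape (the real task body is not visible in this log, so the module and arguments below are placeholders):

```yaml
# Hypothetical sketch: only the templated name and the when-guard are
# taken from the log; the command is a placeholder, not the real task body.
- name: Create EPEL {{ ansible_distribution_major_version }}
  command: echo "set up EPEL {{ ansible_distribution_major_version }}"  # placeholder
  when: ansible_distribution in ['RedHat', 'CentOS']
```

Because this host's `ansible_distribution` fact is neither RedHat nor CentOS, the conditional evaluates to False and the task is skipped, as the `skipping:` result above records.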
40074 1727204609.53566: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 40074 1727204609.53572: in run() - task 12b410aa-8751-9fd7-2501-0000000000f1 40074 1727204609.53575: variable 'ansible_search_path' from source: unknown 40074 1727204609.53577: variable 'ansible_search_path' from source: unknown 40074 1727204609.53658: calling self._execute() 40074 1727204609.53704: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.53717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.53736: variable 'omit' from source: magic vars 40074 1727204609.54215: variable 'ansible_distribution' from source: facts 40074 1727204609.54243: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 40074 1727204609.54252: when evaluation is False, skipping this task 40074 1727204609.54260: _execute() done 40074 1727204609.54269: dumping result to json 40074 1727204609.54277: done dumping result, returning 40074 1727204609.54291: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [12b410aa-8751-9fd7-2501-0000000000f1] 40074 1727204609.54316: sending task result for task 12b410aa-8751-9fd7-2501-0000000000f1 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 40074 1727204609.54582: no more pending results, returning what we have 40074 1727204609.54586: results queue empty 40074 1727204609.54587: checking for any_errors_fatal 40074 1727204609.54597: done checking for any_errors_fatal 40074 1727204609.54598: checking for max_fail_percentage 40074 1727204609.54600: done checking for max_fail_percentage 40074 1727204609.54600: checking to see if all hosts have failed and the running result is not ok 40074 1727204609.54602: done checking to see if all hosts have failed 40074 1727204609.54603: getting the remaining hosts for this loop 40074 
1727204609.54604: done getting the remaining hosts for this loop 40074 1727204609.54608: getting the next task for host managed-node2 40074 1727204609.54614: done getting next task for host managed-node2 40074 1727204609.54617: ^ task is: TASK: Enable EPEL 7 40074 1727204609.54621: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.54624: getting variables 40074 1727204609.54626: in VariableManager get_vars() 40074 1727204609.54660: Calling all_inventory to load vars for managed-node2 40074 1727204609.54663: Calling groups_inventory to load vars for managed-node2 40074 1727204609.54667: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.54683: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.54687: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.54796: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000f1 40074 1727204609.54799: WORKER PROCESS EXITING 40074 1727204609.54812: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.55168: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.55546: done with get_vars() 40074 1727204609.55559: done getting variables 40074 1727204609.55641: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.029) 0:00:03.318 ***** 40074 1727204609.55685: entering _queue_task() for managed-node2/command 40074 1727204609.55990: worker is 1 (out of 1 available) 40074 1727204609.56097: exiting _queue_task() for managed-node2/command 40074 1727204609.56124: done queuing things up, now waiting for results queue to drain 40074 1727204609.56126: waiting for pending results... 
40074 1727204609.56355: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 40074 1727204609.56481: in run() - task 12b410aa-8751-9fd7-2501-0000000000f2 40074 1727204609.56506: variable 'ansible_search_path' from source: unknown 40074 1727204609.56515: variable 'ansible_search_path' from source: unknown 40074 1727204609.56572: calling self._execute() 40074 1727204609.56681: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.56698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.56716: variable 'omit' from source: magic vars 40074 1727204609.57218: variable 'ansible_distribution' from source: facts 40074 1727204609.57241: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 40074 1727204609.57251: when evaluation is False, skipping this task 40074 1727204609.57260: _execute() done 40074 1727204609.57294: dumping result to json 40074 1727204609.57298: done dumping result, returning 40074 1727204609.57301: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [12b410aa-8751-9fd7-2501-0000000000f2] 40074 1727204609.57304: sending task result for task 12b410aa-8751-9fd7-2501-0000000000f2 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 40074 1727204609.57477: no more pending results, returning what we have 40074 1727204609.57481: results queue empty 40074 1727204609.57483: checking for any_errors_fatal 40074 1727204609.57494: done checking for any_errors_fatal 40074 1727204609.57495: checking for max_fail_percentage 40074 1727204609.57497: done checking for max_fail_percentage 40074 1727204609.57499: checking to see if all hosts have failed and the running result is not ok 40074 1727204609.57500: done checking to see if all hosts have failed 40074 1727204609.57501: getting the remaining hosts for this loop 40074 1727204609.57503: done 
getting the remaining hosts for this loop 40074 1727204609.57507: getting the next task for host managed-node2 40074 1727204609.57515: done getting next task for host managed-node2 40074 1727204609.57519: ^ task is: TASK: Enable EPEL 8 40074 1727204609.57523: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.57528: getting variables 40074 1727204609.57532: in VariableManager get_vars() 40074 1727204609.57565: Calling all_inventory to load vars for managed-node2 40074 1727204609.57569: Calling groups_inventory to load vars for managed-node2 40074 1727204609.57574: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.57707: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.57713: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.57719: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.58315: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000f2 40074 1727204609.58319: WORKER PROCESS EXITING 40074 1727204609.58457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.58850: done with get_vars() 40074 1727204609.58861: done getting variables 40074 1727204609.58938: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.032) 0:00:03.351 ***** 40074 1727204609.58970: entering _queue_task() for managed-node2/command 40074 1727204609.59374: worker is 1 (out of 1 available) 40074 1727204609.59387: exiting _queue_task() for managed-node2/command 40074 1727204609.59404: done queuing things up, now waiting for results queue to drain 40074 1727204609.59406: waiting for pending results... 
40074 1727204609.59622: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 40074 1727204609.59801: in run() - task 12b410aa-8751-9fd7-2501-0000000000f3 40074 1727204609.59822: variable 'ansible_search_path' from source: unknown 40074 1727204609.59833: variable 'ansible_search_path' from source: unknown 40074 1727204609.59889: calling self._execute() 40074 1727204609.59997: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.60015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.60035: variable 'omit' from source: magic vars 40074 1727204609.60543: variable 'ansible_distribution' from source: facts 40074 1727204609.60569: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 40074 1727204609.60578: when evaluation is False, skipping this task 40074 1727204609.60587: _execute() done 40074 1727204609.60598: dumping result to json 40074 1727204609.60607: done dumping result, returning 40074 1727204609.60620: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [12b410aa-8751-9fd7-2501-0000000000f3] 40074 1727204609.60634: sending task result for task 12b410aa-8751-9fd7-2501-0000000000f3 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 40074 1727204609.60925: no more pending results, returning what we have 40074 1727204609.60929: results queue empty 40074 1727204609.60933: checking for any_errors_fatal 40074 1727204609.60940: done checking for any_errors_fatal 40074 1727204609.60942: checking for max_fail_percentage 40074 1727204609.60943: done checking for max_fail_percentage 40074 1727204609.60944: checking to see if all hosts have failed and the running result is not ok 40074 1727204609.60946: done checking to see if all hosts have failed 40074 1727204609.60947: getting the remaining hosts for this loop 40074 1727204609.60949: done 
getting the remaining hosts for this loop 40074 1727204609.60953: getting the next task for host managed-node2 40074 1727204609.60963: done getting next task for host managed-node2 40074 1727204609.60966: ^ task is: TASK: Enable EPEL 6 40074 1727204609.60978: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.60983: getting variables 40074 1727204609.60985: in VariableManager get_vars() 40074 1727204609.61019: Calling all_inventory to load vars for managed-node2 40074 1727204609.61023: Calling groups_inventory to load vars for managed-node2 40074 1727204609.61027: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.61094: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000f3 40074 1727204609.61097: WORKER PROCESS EXITING 40074 1727204609.61205: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.61210: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.61214: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.61517: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.61902: done with get_vars() 40074 1727204609.61914: done getting variables 40074 1727204609.61998: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.030) 0:00:03.382 ***** 40074 1727204609.62036: entering _queue_task() for managed-node2/copy 40074 1727204609.62424: worker is 1 (out of 1 available) 40074 1727204609.62437: exiting _queue_task() for managed-node2/copy 40074 1727204609.62448: done queuing things up, now waiting for results queue to drain 40074 1727204609.62450: waiting for pending results... 
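Every EPEL task in this include is skipped the same way on this host: the guard `ansible_distribution in ['RedHat', 'CentOS']` evaluates to False, `_execute()` short-circuits before any module runs, and the worker returns a `skipping:` result carrying the failed condition as `false_condition`. The same guard could also be applied once at block level instead of per task; a sketch (hypothetical -- the log does not show whether enable_epel.yml actually uses a block here):

```yaml
# Hypothetical alternative: one block-level condition covering all EPEL
# tasks, instead of repeating when: on each task. Task bodies are placeholders.
- block:
    - name: Enable EPEL 7
      command: echo "enable epel 7"  # placeholder
    - name: Enable EPEL 8
      command: echo "enable epel 8"  # placeholder
  when: ansible_distribution in ['RedHat', 'CentOS']
```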
40074 1727204609.62619: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 40074 1727204609.62785: in run() - task 12b410aa-8751-9fd7-2501-0000000000f5 40074 1727204609.62790: variable 'ansible_search_path' from source: unknown 40074 1727204609.62794: variable 'ansible_search_path' from source: unknown 40074 1727204609.62834: calling self._execute() 40074 1727204609.62957: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.62961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.62972: variable 'omit' from source: magic vars 40074 1727204609.63550: variable 'ansible_distribution' from source: facts 40074 1727204609.63606: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 40074 1727204609.63610: when evaluation is False, skipping this task 40074 1727204609.63617: _execute() done 40074 1727204609.63619: dumping result to json 40074 1727204609.63621: done dumping result, returning 40074 1727204609.63624: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [12b410aa-8751-9fd7-2501-0000000000f5] 40074 1727204609.63626: sending task result for task 12b410aa-8751-9fd7-2501-0000000000f5 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 40074 1727204609.63805: no more pending results, returning what we have 40074 1727204609.63809: results queue empty 40074 1727204609.63810: checking for any_errors_fatal 40074 1727204609.63816: done checking for any_errors_fatal 40074 1727204609.63818: checking for max_fail_percentage 40074 1727204609.63819: done checking for max_fail_percentage 40074 1727204609.63820: checking to see if all hosts have failed and the running result is not ok 40074 1727204609.63822: done checking to see if all hosts have failed 40074 1727204609.63823: getting the remaining hosts for this loop 40074 1727204609.63824: done 
getting the remaining hosts for this loop 40074 1727204609.63829: getting the next task for host managed-node2 40074 1727204609.63842: done getting next task for host managed-node2 40074 1727204609.63846: ^ task is: TASK: Set network provider to 'nm' 40074 1727204609.63849: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204609.63854: getting variables 40074 1727204609.63855: in VariableManager get_vars() 40074 1727204609.63892: Calling all_inventory to load vars for managed-node2 40074 1727204609.63896: Calling groups_inventory to load vars for managed-node2 40074 1727204609.63900: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.63916: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.63920: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.63925: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.64480: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000f5 40074 1727204609.64483: WORKER PROCESS EXITING 40074 1727204609.64511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.64905: done with get_vars() 40074 1727204609.64916: done getting variables 40074 1727204609.64992: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:13 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.029) 0:00:03.411 ***** 40074 1727204609.65022: entering _queue_task() for managed-node2/set_fact 40074 1727204609.65279: worker is 1 (out of 1 available) 40074 1727204609.65410: exiting _queue_task() for managed-node2/set_fact 40074 1727204609.65420: done queuing things up, now waiting for results queue to drain 40074 1727204609.65422: waiting for pending results... 40074 1727204609.65586: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 40074 1727204609.65698: in run() - task 12b410aa-8751-9fd7-2501-000000000007 40074 1727204609.65718: variable 'ansible_search_path' from source: unknown 40074 1727204609.65772: calling self._execute() 40074 1727204609.65874: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.65887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.65908: variable 'omit' from source: magic vars 40074 1727204609.66051: variable 'omit' from source: magic vars 40074 1727204609.66107: variable 'omit' from source: magic vars 40074 1727204609.66155: variable 'omit' from source: magic vars 40074 1727204609.66217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204609.66267: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204609.66306: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204609.66335: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204609.66389: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204609.66402: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204609.66413: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.66422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.66565: Set connection var ansible_pipelining to False 40074 1727204609.66608: Set connection var ansible_shell_executable to /bin/sh 40074 1727204609.66612: Set connection var ansible_shell_type to sh 40074 1727204609.66614: Set connection var ansible_connection to ssh 40074 1727204609.66616: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204609.66620: Set connection var ansible_timeout to 10 40074 1727204609.66655: variable 'ansible_shell_executable' from source: unknown 40074 1727204609.66693: variable 'ansible_connection' from source: unknown 40074 1727204609.66697: variable 'ansible_module_compression' from source: unknown 40074 1727204609.66700: variable 'ansible_shell_type' from source: unknown 40074 1727204609.66702: variable 'ansible_shell_executable' from source: unknown 40074 1727204609.66705: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.66707: variable 'ansible_pipelining' from source: unknown 40074 1727204609.66716: variable 'ansible_timeout' from source: unknown 40074 1727204609.66719: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.66901: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204609.66938: variable 'omit' from source: magic vars 40074 1727204609.66942: starting attempt loop 40074 1727204609.67044: running the handler 40074 1727204609.67048: handler run complete 40074 1727204609.67051: attempt loop 
complete, returning result 40074 1727204609.67053: _execute() done 40074 1727204609.67057: dumping result to json 40074 1727204609.67059: done dumping result, returning 40074 1727204609.67061: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [12b410aa-8751-9fd7-2501-000000000007] 40074 1727204609.67063: sending task result for task 12b410aa-8751-9fd7-2501-000000000007 40074 1727204609.67129: done sending task result for task 12b410aa-8751-9fd7-2501-000000000007 40074 1727204609.67135: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 40074 1727204609.67203: no more pending results, returning what we have 40074 1727204609.67206: results queue empty 40074 1727204609.67207: checking for any_errors_fatal 40074 1727204609.67214: done checking for any_errors_fatal 40074 1727204609.67215: checking for max_fail_percentage 40074 1727204609.67217: done checking for max_fail_percentage 40074 1727204609.67218: checking to see if all hosts have failed and the running result is not ok 40074 1727204609.67220: done checking to see if all hosts have failed 40074 1727204609.67221: getting the remaining hosts for this loop 40074 1727204609.67222: done getting the remaining hosts for this loop 40074 1727204609.67227: getting the next task for host managed-node2 40074 1727204609.67236: done getting next task for host managed-node2 40074 1727204609.67239: ^ task is: TASK: meta (flush_handlers) 40074 1727204609.67241: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.67246: getting variables 40074 1727204609.67248: in VariableManager get_vars() 40074 1727204609.67393: Calling all_inventory to load vars for managed-node2 40074 1727204609.67397: Calling groups_inventory to load vars for managed-node2 40074 1727204609.67493: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.67504: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.67508: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.67518: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.67808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.68173: done with get_vars() 40074 1727204609.68185: done getting variables 40074 1727204609.68270: in VariableManager get_vars() 40074 1727204609.68288: Calling all_inventory to load vars for managed-node2 40074 1727204609.68294: Calling groups_inventory to load vars for managed-node2 40074 1727204609.68297: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.68303: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.68306: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.68310: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.68559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.68922: done with get_vars() 40074 1727204609.68949: done queuing things up, now waiting for results queue to drain 40074 1727204609.68952: results queue empty 40074 1727204609.68953: checking for any_errors_fatal 40074 1727204609.68955: done checking for any_errors_fatal 40074 1727204609.68957: checking for max_fail_percentage 40074 1727204609.68958: done checking for max_fail_percentage 40074 1727204609.68959: checking to see if all hosts have failed and the running result is not 
ok 40074 1727204609.68960: done checking to see if all hosts have failed 40074 1727204609.68961: getting the remaining hosts for this loop 40074 1727204609.68962: done getting the remaining hosts for this loop 40074 1727204609.68965: getting the next task for host managed-node2 40074 1727204609.68970: done getting next task for host managed-node2 40074 1727204609.68972: ^ task is: TASK: meta (flush_handlers) 40074 1727204609.68973: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204609.68981: getting variables 40074 1727204609.68983: in VariableManager get_vars() 40074 1727204609.68994: Calling all_inventory to load vars for managed-node2 40074 1727204609.68997: Calling groups_inventory to load vars for managed-node2 40074 1727204609.69000: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.69006: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.69009: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.69013: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.69273: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.69662: done with get_vars() 40074 1727204609.69672: done getting variables 40074 1727204609.69739: in VariableManager get_vars() 40074 1727204609.69749: Calling all_inventory to load vars for managed-node2 40074 1727204609.69752: Calling groups_inventory to load vars for managed-node2 40074 1727204609.69755: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.69760: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.69763: Calling groups_plugins_inventory to load vars for 
managed-node2 40074 1727204609.69767: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.70013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.70391: done with get_vars() 40074 1727204609.70405: done queuing things up, now waiting for results queue to drain 40074 1727204609.70408: results queue empty 40074 1727204609.70409: checking for any_errors_fatal 40074 1727204609.70410: done checking for any_errors_fatal 40074 1727204609.70411: checking for max_fail_percentage 40074 1727204609.70412: done checking for max_fail_percentage 40074 1727204609.70413: checking to see if all hosts have failed and the running result is not ok 40074 1727204609.70414: done checking to see if all hosts have failed 40074 1727204609.70415: getting the remaining hosts for this loop 40074 1727204609.70416: done getting the remaining hosts for this loop 40074 1727204609.70419: getting the next task for host managed-node2 40074 1727204609.70422: done getting next task for host managed-node2 40074 1727204609.70423: ^ task is: None 40074 1727204609.70425: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.70426: done queuing things up, now waiting for results queue to drain 40074 1727204609.70427: results queue empty 40074 1727204609.70428: checking for any_errors_fatal 40074 1727204609.70429: done checking for any_errors_fatal 40074 1727204609.70432: checking for max_fail_percentage 40074 1727204609.70434: done checking for max_fail_percentage 40074 1727204609.70435: checking to see if all hosts have failed and the running result is not ok 40074 1727204609.70436: done checking to see if all hosts have failed 40074 1727204609.70437: getting the next task for host managed-node2 40074 1727204609.70440: done getting next task for host managed-node2 40074 1727204609.70441: ^ task is: None 40074 1727204609.70443: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.70500: in VariableManager get_vars() 40074 1727204609.70527: done with get_vars() 40074 1727204609.70538: in VariableManager get_vars() 40074 1727204609.70559: done with get_vars() 40074 1727204609.70584: variable 'omit' from source: magic vars 40074 1727204609.70624: in VariableManager get_vars() 40074 1727204609.70647: done with get_vars() 40074 1727204609.70679: variable 'omit' from source: magic vars PLAY [Test output device of routes] ******************************************** 40074 1727204609.71397: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 40074 1727204609.71424: getting the remaining hosts for this loop 40074 1727204609.71425: done getting the remaining hosts for this loop 40074 1727204609.71428: getting the next task for host managed-node2 40074 1727204609.71434: done getting next task for host managed-node2 40074 1727204609.71437: ^ task is: TASK: Gathering Facts 40074 1727204609.71443: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204609.71446: getting variables 40074 1727204609.71447: in VariableManager get_vars() 40074 1727204609.71467: Calling all_inventory to load vars for managed-node2 40074 1727204609.71470: Calling groups_inventory to load vars for managed-node2 40074 1727204609.71473: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204609.71479: Calling all_plugins_play to load vars for managed-node2 40074 1727204609.71498: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204609.71503: Calling groups_plugins_play to load vars for managed-node2 40074 1727204609.71758: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204609.72143: done with get_vars() 40074 1727204609.72152: done getting variables 40074 1727204609.72203: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3 Tuesday 24 September 2024 15:03:29 -0400 (0:00:00.072) 0:00:03.484 ***** 40074 1727204609.72236: entering _queue_task() for managed-node2/gather_facts 40074 1727204609.72564: worker is 1 (out of 1 available) 40074 1727204609.72576: exiting _queue_task() for managed-node2/gather_facts 40074 1727204609.72593: done queuing things up, now waiting for results queue to drain 40074 1727204609.72595: waiting for pending results... 
40074 1727204609.72856: running TaskExecutor() for managed-node2/TASK: Gathering Facts 40074 1727204609.72996: in run() - task 12b410aa-8751-9fd7-2501-00000000011b 40074 1727204609.73002: variable 'ansible_search_path' from source: unknown 40074 1727204609.73133: calling self._execute() 40074 1727204609.73143: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.73158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.73175: variable 'omit' from source: magic vars 40074 1727204609.73617: variable 'ansible_distribution_major_version' from source: facts 40074 1727204609.73637: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204609.73648: variable 'omit' from source: magic vars 40074 1727204609.73685: variable 'omit' from source: magic vars 40074 1727204609.73739: variable 'omit' from source: magic vars 40074 1727204609.73807: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204609.73842: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204609.73870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204609.73915: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204609.73923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204609.73994: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204609.73998: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.74000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.74118: Set connection var ansible_pipelining to False 40074 1727204609.74138: Set 
connection var ansible_shell_executable to /bin/sh 40074 1727204609.74147: Set connection var ansible_shell_type to sh 40074 1727204609.74155: Set connection var ansible_connection to ssh 40074 1727204609.74168: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204609.74194: Set connection var ansible_timeout to 10 40074 1727204609.74217: variable 'ansible_shell_executable' from source: unknown 40074 1727204609.74240: variable 'ansible_connection' from source: unknown 40074 1727204609.74244: variable 'ansible_module_compression' from source: unknown 40074 1727204609.74247: variable 'ansible_shell_type' from source: unknown 40074 1727204609.74349: variable 'ansible_shell_executable' from source: unknown 40074 1727204609.74353: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204609.74356: variable 'ansible_pipelining' from source: unknown 40074 1727204609.74358: variable 'ansible_timeout' from source: unknown 40074 1727204609.74361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204609.74521: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204609.74542: variable 'omit' from source: magic vars 40074 1727204609.74554: starting attempt loop 40074 1727204609.74568: running the handler 40074 1727204609.74592: variable 'ansible_facts' from source: unknown 40074 1727204609.74620: _low_level_execute_command(): starting 40074 1727204609.74633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204609.75540: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204609.75581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204609.75694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204609.75712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204609.75802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204609.78265: stdout chunk (state=3): >>>/root <<< 40074 1727204609.78530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204609.78534: stdout chunk (state=3): >>><<< 40074 1727204609.78536: stderr chunk (state=3): >>><<< 40074 1727204609.78669: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204609.78673: _low_level_execute_command(): starting 40074 1727204609.78676: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092 `" && echo ansible-tmp-1727204609.7856433-40229-245819708737092="` echo /root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092 `" ) && sleep 0' 40074 1727204609.79357: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204609.79361: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204609.79461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204609.79486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204609.79505: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204609.79585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204609.82475: stdout chunk (state=3): >>>ansible-tmp-1727204609.7856433-40229-245819708737092=/root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092 <<< 40074 1727204609.82710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204609.82793: stderr chunk (state=3): >>><<< 40074 1727204609.82885: stdout chunk (state=3): >>><<< 40074 1727204609.82889: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204609.7856433-40229-245819708737092=/root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204609.82893: variable 'ansible_module_compression' from source: unknown 40074 1727204609.82947: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 40074 1727204609.83027: variable 'ansible_facts' from source: unknown 40074 1727204609.83255: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/AnsiballZ_setup.py 40074 1727204609.84008: Sending initial data 40074 1727204609.84011: Sent initial data (154 bytes) 40074 1727204609.85010: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204609.85203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204609.85219: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204609.85229: stderr 
chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204609.85507: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204609.87739: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204609.87852: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204609.87933: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp1fg12plm /root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/AnsiballZ_setup.py <<< 40074 1727204609.87937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/AnsiballZ_setup.py" <<< 40074 1727204609.88111: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp1fg12plm" to remote "/root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/AnsiballZ_setup.py" <<< 40074 1727204609.90622: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204609.90848: stderr chunk (state=3): >>><<< 40074 1727204609.90851: stdout chunk (state=3): >>><<< 40074 1727204609.90854: done transferring module to remote 40074 1727204609.90856: _low_level_execute_command(): starting 40074 1727204609.90859: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/ /root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/AnsiballZ_setup.py && sleep 0' 40074 1727204609.91416: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204609.91430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204609.91449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204609.91469: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204609.91583: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204609.91607: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204609.91622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204609.91717: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204609.94637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204609.94744: stderr chunk (state=3): >>><<< 40074 1727204609.94768: stdout chunk (state=3): >>><<< 40074 1727204609.94797: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204609.94900: _low_level_execute_command(): starting 40074 1727204609.94904: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/AnsiballZ_setup.py && sleep 0' 40074 1727204609.95519: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204609.95543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204609.95561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204609.95677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204609.95708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204609.95803: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 40074 1727204610.91470: stdout chunk (state=3): >>> <<< 40074 1727204610.91528: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, 
"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXE<<< 40074 1727204610.91556: stdout chunk (state=3): >>>TAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", 
"ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2791, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 926, "free": 2791}, "nocache": {"free": 3452, "used": 265}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "ven<<< 40074 1727204610.91813: stdout chunk (state=3): >>>dor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1114, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251124699136, "block_size": 4096, "block_total": 64479564, "block_available": 61309741, "block_used": 3169823, "inode_total": 16384000, "inode_available": 16302098, "inode_used": 81902, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", 
"tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", 
"hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv<<< 40074 1727204610.91840: stdout chunk (state=3): >>>6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off 
[fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "ee:fa:4b:42:85:80", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", 
"tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": "on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload":<<< 40074 1727204610.91876: stdout chunk (state=3): >>> "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_loadavg": {"1m": 0.4453125, "5m": 0.60107421875, "15m": 0.46630859375}, "ansible_system_capabilities_enforced": "False", 
"ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "30", "epoch": "1727204610", "epoch_int": "1727204610", "date": "2024-09-24", "time": "15:03:30", "iso8601_micro": "2024-09-24T19:03:30.907768Z", "iso8601": "2024-09-24T19:03:30Z", "iso8601_basic": "20240924T150330907768", "iso8601_basic_short": "20240924T150330", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_hostnqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 40074 1727204610.94841: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204610.94905: stderr chunk (state=3): >>><<< 40074 1727204610.94908: stdout chunk (state=3): >>><<< 40074 1727204610.94945: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_iscsi_iqn": "", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": 
"unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", 
"ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2791, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 926, "free": 2791}, "nocache": {"free": 3452, "used": 265}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 1114, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251124699136, "block_size": 4096, "block_total": 64479564, "block_available": 61309741, "block_used": 3169823, "inode_total": 16384000, "inode_available": 16302098, "inode_used": 81902, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "rpltstbr", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", 
"tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, 
"timestamping": [], "hw_timestamp_filters": []}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", 
"rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_rpltstbr": {"device": "rpltstbr", "macaddress": "ee:fa:4b:42:85:80", "mtu": 1500, "active": false, "type": "bridge", "interfaces": [], "id": "8000.000000000000", "stp": false, "speed": -1, "promisc": false, "ipv4": {"address": "192.0.2.72", "broadcast": "", "netmask": "255.255.255.254", "network": "192.0.2.72", "prefix": "31"}, "features": {"rx_checksumming": "off [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "on", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "on", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "on", "tx_fcoe_segmentation": "on", "tx_gre_segmentation": "on", "tx_gre_csum_segmentation": "on", "tx_ipxip4_segmentation": 
"on", "tx_ipxip6_segmentation": "on", "tx_udp_tnl_segmentation": "on", "tx_udp_tnl_csum_segmentation": "on", "tx_gso_partial": "on", "tx_tunnel_remcsum_segmentation": "on", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "on", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "on", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.72"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.72"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_loadavg": {"1m": 0.4453125, "5m": 0.60107421875, "15m": 0.46630859375}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_lsb": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "15", "minute": "03", "second": "30", "epoch": "1727204610", "epoch_int": "1727204610", "date": "2024-09-24", "time": "15:03:30", "iso8601_micro": "2024-09-24T19:03:30.907768Z", "iso8601": "2024-09-24T19:03:30Z", "iso8601_basic": "20240924T150330907768", "iso8601_basic_short": "20240924T150330", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_hostnqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204610.95292: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204610.95311: _low_level_execute_command(): starting 40074 1727204610.95319: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204609.7856433-40229-245819708737092/ > /dev/null 2>&1 && sleep 0' 40074 1727204610.95786: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204610.95800: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204610.95806: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204610.95809: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204610.95857: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204610.95862: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204610.95864: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204610.95917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204610.98633: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204610.98679: stderr chunk (state=3): >>><<< 40074 1727204610.98682: stdout chunk (state=3): >>><<< 40074 1727204610.98702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204610.98717: handler run complete 40074 1727204610.98847: variable 'ansible_facts' from source: unknown 40074 1727204610.98942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204610.99269: variable 'ansible_facts' from source: unknown 40074 1727204610.99350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204610.99485: attempt loop complete, returning result 40074 1727204610.99492: _execute() done 40074 1727204610.99497: dumping result to json 40074 1727204610.99525: done dumping result, returning 40074 1727204610.99535: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-9fd7-2501-00000000011b] 40074 1727204610.99538: sending task result for task 12b410aa-8751-9fd7-2501-00000000011b ok: [managed-node2] 40074 1727204611.00233: no more pending results, returning what we have 40074 1727204611.00235: results queue empty 40074 1727204611.00236: checking for any_errors_fatal 40074 1727204611.00237: done checking for 
any_errors_fatal 40074 1727204611.00238: checking for max_fail_percentage 40074 1727204611.00239: done checking for max_fail_percentage 40074 1727204611.00239: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.00240: done checking to see if all hosts have failed 40074 1727204611.00241: getting the remaining hosts for this loop 40074 1727204611.00241: done getting the remaining hosts for this loop 40074 1727204611.00244: getting the next task for host managed-node2 40074 1727204611.00248: done getting next task for host managed-node2 40074 1727204611.00250: ^ task is: TASK: meta (flush_handlers) 40074 1727204611.00251: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204611.00254: getting variables 40074 1727204611.00255: in VariableManager get_vars() 40074 1727204611.00280: Calling all_inventory to load vars for managed-node2 40074 1727204611.00282: Calling groups_inventory to load vars for managed-node2 40074 1727204611.00284: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.00296: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.00298: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.00301: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.00443: done sending task result for task 12b410aa-8751-9fd7-2501-00000000011b 40074 1727204611.00449: WORKER PROCESS EXITING 40074 1727204611.00468: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.00682: done with get_vars() 40074 1727204611.00693: done getting variables 40074 1727204611.00750: in VariableManager get_vars() 40074 1727204611.00763: 
Calling all_inventory to load vars for managed-node2 40074 1727204611.00765: Calling groups_inventory to load vars for managed-node2 40074 1727204611.00768: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.00772: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.00774: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.00776: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.00931: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.01122: done with get_vars() 40074 1727204611.01134: done queuing things up, now waiting for results queue to drain 40074 1727204611.01135: results queue empty 40074 1727204611.01136: checking for any_errors_fatal 40074 1727204611.01139: done checking for any_errors_fatal 40074 1727204611.01139: checking for max_fail_percentage 40074 1727204611.01140: done checking for max_fail_percentage 40074 1727204611.01141: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.01145: done checking to see if all hosts have failed 40074 1727204611.01146: getting the remaining hosts for this loop 40074 1727204611.01146: done getting the remaining hosts for this loop 40074 1727204611.01148: getting the next task for host managed-node2 40074 1727204611.01151: done getting next task for host managed-node2 40074 1727204611.01153: ^ task is: TASK: Set type and interface0 40074 1727204611.01154: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.01156: getting variables 40074 1727204611.01157: in VariableManager get_vars() 40074 1727204611.01167: Calling all_inventory to load vars for managed-node2 40074 1727204611.01169: Calling groups_inventory to load vars for managed-node2 40074 1727204611.01171: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.01174: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.01176: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.01178: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.01312: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.01501: done with get_vars() 40074 1727204611.01509: done getting variables 40074 1727204611.01548: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set type and interface0] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:11 Tuesday 24 September 2024 15:03:31 -0400 (0:00:01.293) 0:00:04.777 ***** 40074 1727204611.01571: entering _queue_task() for managed-node2/set_fact 40074 1727204611.01809: worker is 1 (out of 1 available) 40074 1727204611.01823: exiting _queue_task() for managed-node2/set_fact 40074 1727204611.01835: done queuing things up, now waiting for results queue to drain 40074 1727204611.01837: waiting for pending results... 
40074 1727204611.02009: running TaskExecutor() for managed-node2/TASK: Set type and interface0 40074 1727204611.02080: in run() - task 12b410aa-8751-9fd7-2501-00000000000b 40074 1727204611.02091: variable 'ansible_search_path' from source: unknown 40074 1727204611.02123: calling self._execute() 40074 1727204611.02201: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.02209: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.02220: variable 'omit' from source: magic vars 40074 1727204611.02532: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.02546: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.02553: variable 'omit' from source: magic vars 40074 1727204611.02577: variable 'omit' from source: magic vars 40074 1727204611.02607: variable 'type' from source: play vars 40074 1727204611.02673: variable 'type' from source: play vars 40074 1727204611.02701: variable 'interface0' from source: play vars 40074 1727204611.02761: variable 'interface0' from source: play vars 40074 1727204611.02776: variable 'omit' from source: magic vars 40074 1727204611.02812: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204611.02848: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204611.02866: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204611.02883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204611.02896: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204611.02922: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204611.02927: 
variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.02930: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.03021: Set connection var ansible_pipelining to False 40074 1727204611.03027: Set connection var ansible_shell_executable to /bin/sh 40074 1727204611.03030: Set connection var ansible_shell_type to sh 40074 1727204611.03036: Set connection var ansible_connection to ssh 40074 1727204611.03045: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204611.03058: Set connection var ansible_timeout to 10 40074 1727204611.03077: variable 'ansible_shell_executable' from source: unknown 40074 1727204611.03080: variable 'ansible_connection' from source: unknown 40074 1727204611.03084: variable 'ansible_module_compression' from source: unknown 40074 1727204611.03088: variable 'ansible_shell_type' from source: unknown 40074 1727204611.03093: variable 'ansible_shell_executable' from source: unknown 40074 1727204611.03096: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.03102: variable 'ansible_pipelining' from source: unknown 40074 1727204611.03105: variable 'ansible_timeout' from source: unknown 40074 1727204611.03111: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.03233: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204611.03246: variable 'omit' from source: magic vars 40074 1727204611.03252: starting attempt loop 40074 1727204611.03257: running the handler 40074 1727204611.03275: handler run complete 40074 1727204611.03280: attempt loop complete, returning result 40074 1727204611.03283: _execute() done 40074 1727204611.03286: 
dumping result to json 40074 1727204611.03294: done dumping result, returning 40074 1727204611.03300: done running TaskExecutor() for managed-node2/TASK: Set type and interface0 [12b410aa-8751-9fd7-2501-00000000000b] 40074 1727204611.03305: sending task result for task 12b410aa-8751-9fd7-2501-00000000000b 40074 1727204611.03392: done sending task result for task 12b410aa-8751-9fd7-2501-00000000000b 40074 1727204611.03395: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "interface": "ethtest0", "type": "veth" }, "changed": false } 40074 1727204611.03452: no more pending results, returning what we have 40074 1727204611.03455: results queue empty 40074 1727204611.03456: checking for any_errors_fatal 40074 1727204611.03459: done checking for any_errors_fatal 40074 1727204611.03460: checking for max_fail_percentage 40074 1727204611.03462: done checking for max_fail_percentage 40074 1727204611.03463: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.03465: done checking to see if all hosts have failed 40074 1727204611.03466: getting the remaining hosts for this loop 40074 1727204611.03467: done getting the remaining hosts for this loop 40074 1727204611.03471: getting the next task for host managed-node2 40074 1727204611.03476: done getting next task for host managed-node2 40074 1727204611.03479: ^ task is: TASK: Show interfaces 40074 1727204611.03480: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.03484: getting variables 40074 1727204611.03486: in VariableManager get_vars() 40074 1727204611.03532: Calling all_inventory to load vars for managed-node2 40074 1727204611.03535: Calling groups_inventory to load vars for managed-node2 40074 1727204611.03538: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.03549: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.03552: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.03555: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.03769: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.03967: done with get_vars() 40074 1727204611.03974: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:15 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.024) 0:00:04.802 ***** 40074 1727204611.04048: entering _queue_task() for managed-node2/include_tasks 40074 1727204611.04266: worker is 1 (out of 1 available) 40074 1727204611.04281: exiting _queue_task() for managed-node2/include_tasks 40074 1727204611.04295: done queuing things up, now waiting for results queue to drain 40074 1727204611.04297: waiting for pending results... 
40074 1727204611.04456: running TaskExecutor() for managed-node2/TASK: Show interfaces 40074 1727204611.04519: in run() - task 12b410aa-8751-9fd7-2501-00000000000c 40074 1727204611.04541: variable 'ansible_search_path' from source: unknown 40074 1727204611.04566: calling self._execute() 40074 1727204611.04643: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.04647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.04657: variable 'omit' from source: magic vars 40074 1727204611.04958: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.04971: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.04978: _execute() done 40074 1727204611.04990: dumping result to json 40074 1727204611.04994: done dumping result, returning 40074 1727204611.05003: done running TaskExecutor() for managed-node2/TASK: Show interfaces [12b410aa-8751-9fd7-2501-00000000000c] 40074 1727204611.05006: sending task result for task 12b410aa-8751-9fd7-2501-00000000000c 40074 1727204611.05104: done sending task result for task 12b410aa-8751-9fd7-2501-00000000000c 40074 1727204611.05107: WORKER PROCESS EXITING 40074 1727204611.05141: no more pending results, returning what we have 40074 1727204611.05145: in VariableManager get_vars() 40074 1727204611.05185: Calling all_inventory to load vars for managed-node2 40074 1727204611.05188: Calling groups_inventory to load vars for managed-node2 40074 1727204611.05193: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.05204: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.05207: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.05210: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.05388: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 
1727204611.05579: done with get_vars() 40074 1727204611.05588: variable 'ansible_search_path' from source: unknown 40074 1727204611.05600: we have included files to process 40074 1727204611.05601: generating all_blocks data 40074 1727204611.05602: done generating all_blocks data 40074 1727204611.05603: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204611.05604: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204611.05606: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204611.05728: in VariableManager get_vars() 40074 1727204611.05747: done with get_vars() 40074 1727204611.05840: done processing included file 40074 1727204611.05842: iterating over new_blocks loaded from include file 40074 1727204611.05843: in VariableManager get_vars() 40074 1727204611.05857: done with get_vars() 40074 1727204611.05858: filtering new block on tags 40074 1727204611.05871: done filtering new block on tags 40074 1727204611.05873: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 40074 1727204611.05877: extending task lists for all hosts with included blocks 40074 1727204611.05995: done extending task lists 40074 1727204611.05996: done processing included files 40074 1727204611.05997: results queue empty 40074 1727204611.05997: checking for any_errors_fatal 40074 1727204611.06000: done checking for any_errors_fatal 40074 1727204611.06000: checking for max_fail_percentage 40074 1727204611.06001: done checking for max_fail_percentage 40074 1727204611.06002: checking to see if all hosts have failed and the running result is not ok 40074 
1727204611.06002: done checking to see if all hosts have failed 40074 1727204611.06003: getting the remaining hosts for this loop 40074 1727204611.06004: done getting the remaining hosts for this loop 40074 1727204611.06006: getting the next task for host managed-node2 40074 1727204611.06009: done getting next task for host managed-node2 40074 1727204611.06010: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 40074 1727204611.06012: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.06014: getting variables 40074 1727204611.06015: in VariableManager get_vars() 40074 1727204611.06028: Calling all_inventory to load vars for managed-node2 40074 1727204611.06030: Calling groups_inventory to load vars for managed-node2 40074 1727204611.06057: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.06062: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.06064: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.06066: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.06198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.06387: done with get_vars() 40074 1727204611.06397: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.024) 0:00:04.826 ***** 40074 1727204611.06452: entering _queue_task() for managed-node2/include_tasks 40074 1727204611.06663: worker is 1 (out of 1 available) 40074 1727204611.06677: exiting _queue_task() for managed-node2/include_tasks 40074 1727204611.06690: done queuing things up, now waiting for results queue to drain 40074 1727204611.06692: waiting for pending results... 
40074 1727204611.06847: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 40074 1727204611.06917: in run() - task 12b410aa-8751-9fd7-2501-000000000135 40074 1727204611.06937: variable 'ansible_search_path' from source: unknown 40074 1727204611.06941: variable 'ansible_search_path' from source: unknown 40074 1727204611.06963: calling self._execute() 40074 1727204611.07041: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.07045: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.07054: variable 'omit' from source: magic vars 40074 1727204611.07351: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.07363: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.07375: _execute() done 40074 1727204611.07385: dumping result to json 40074 1727204611.07391: done dumping result, returning 40074 1727204611.07394: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-9fd7-2501-000000000135] 40074 1727204611.07401: sending task result for task 12b410aa-8751-9fd7-2501-000000000135 40074 1727204611.07489: done sending task result for task 12b410aa-8751-9fd7-2501-000000000135 40074 1727204611.07492: WORKER PROCESS EXITING 40074 1727204611.07523: no more pending results, returning what we have 40074 1727204611.07527: in VariableManager get_vars() 40074 1727204611.07566: Calling all_inventory to load vars for managed-node2 40074 1727204611.07570: Calling groups_inventory to load vars for managed-node2 40074 1727204611.07573: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.07583: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.07585: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.07590: Calling groups_plugins_play to load vars for managed-node2 40074 
1727204611.07762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.07975: done with get_vars() 40074 1727204611.07981: variable 'ansible_search_path' from source: unknown 40074 1727204611.07982: variable 'ansible_search_path' from source: unknown 40074 1727204611.08013: we have included files to process 40074 1727204611.08014: generating all_blocks data 40074 1727204611.08015: done generating all_blocks data 40074 1727204611.08016: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204611.08016: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204611.08018: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204611.08266: done processing included file 40074 1727204611.08267: iterating over new_blocks loaded from include file 40074 1727204611.08269: in VariableManager get_vars() 40074 1727204611.08282: done with get_vars() 40074 1727204611.08283: filtering new block on tags 40074 1727204611.08301: done filtering new block on tags 40074 1727204611.08303: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 40074 1727204611.08307: extending task lists for all hosts with included blocks 40074 1727204611.08384: done extending task lists 40074 1727204611.08385: done processing included files 40074 1727204611.08386: results queue empty 40074 1727204611.08386: checking for any_errors_fatal 40074 1727204611.08388: done checking for any_errors_fatal 40074 1727204611.08390: checking for max_fail_percentage 40074 1727204611.08391: done 
checking for max_fail_percentage 40074 1727204611.08392: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.08392: done checking to see if all hosts have failed 40074 1727204611.08393: getting the remaining hosts for this loop 40074 1727204611.08394: done getting the remaining hosts for this loop 40074 1727204611.08396: getting the next task for host managed-node2 40074 1727204611.08399: done getting next task for host managed-node2 40074 1727204611.08401: ^ task is: TASK: Gather current interface info 40074 1727204611.08404: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.08407: getting variables 40074 1727204611.08408: in VariableManager get_vars() 40074 1727204611.08418: Calling all_inventory to load vars for managed-node2 40074 1727204611.08419: Calling groups_inventory to load vars for managed-node2 40074 1727204611.08421: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.08425: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.08427: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.08429: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.08564: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.08774: done with get_vars() 40074 1727204611.08781: done getting variables 40074 1727204611.08815: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.023) 0:00:04.850 ***** 40074 1727204611.08840: entering _queue_task() for managed-node2/command 40074 1727204611.09035: worker is 1 (out of 1 available) 40074 1727204611.09050: exiting _queue_task() for managed-node2/command 40074 1727204611.09063: done queuing things up, now waiting for results queue to drain 40074 1727204611.09065: waiting for pending results... 
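The "Gather current interface info" task being queued here invokes the command action plugin. Judging from the module_args recorded further down in this log (`chdir: /sys/class/net`, `_raw_params: ls -1`), the task at get_current_interfaces.yml:3 plausibly looks like the following sketch; the `register` variable name is an assumption, not taken from the log:

```yaml
# Sketch assembled from the module_args logged for this task
# ({"chdir": "/sys/class/net", "_raw_params": "ls -1"}).
# The register name "interface_output" is an assumption.
- name: Gather current interface info
  command: ls -1
  args:
    chdir: /sys/class/net
  register: interface_output
```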
40074 1727204611.09215: running TaskExecutor() for managed-node2/TASK: Gather current interface info 40074 1727204611.09298: in run() - task 12b410aa-8751-9fd7-2501-00000000014e 40074 1727204611.09309: variable 'ansible_search_path' from source: unknown 40074 1727204611.09312: variable 'ansible_search_path' from source: unknown 40074 1727204611.09343: calling self._execute() 40074 1727204611.09416: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.09426: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.09432: variable 'omit' from source: magic vars 40074 1727204611.09736: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.09747: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.09759: variable 'omit' from source: magic vars 40074 1727204611.09797: variable 'omit' from source: magic vars 40074 1727204611.09829: variable 'omit' from source: magic vars 40074 1727204611.09867: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204611.09899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204611.09918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204611.09937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204611.09950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204611.09980: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204611.09983: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.09988: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 
1727204611.10080: Set connection var ansible_pipelining to False 40074 1727204611.10084: Set connection var ansible_shell_executable to /bin/sh 40074 1727204611.10086: Set connection var ansible_shell_type to sh 40074 1727204611.10092: Set connection var ansible_connection to ssh 40074 1727204611.10100: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204611.10106: Set connection var ansible_timeout to 10 40074 1727204611.10128: variable 'ansible_shell_executable' from source: unknown 40074 1727204611.10131: variable 'ansible_connection' from source: unknown 40074 1727204611.10138: variable 'ansible_module_compression' from source: unknown 40074 1727204611.10141: variable 'ansible_shell_type' from source: unknown 40074 1727204611.10145: variable 'ansible_shell_executable' from source: unknown 40074 1727204611.10149: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.10154: variable 'ansible_pipelining' from source: unknown 40074 1727204611.10158: variable 'ansible_timeout' from source: unknown 40074 1727204611.10164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.10280: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204611.10292: variable 'omit' from source: magic vars 40074 1727204611.10302: starting attempt loop 40074 1727204611.10305: running the handler 40074 1727204611.10319: _low_level_execute_command(): starting 40074 1727204611.10325: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204611.10883: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 40074 1727204611.10887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.10892: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204611.10895: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.10944: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204611.10948: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204611.11008: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204611.13417: stdout chunk (state=3): >>>/root <<< 40074 1727204611.13580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204611.13634: stderr chunk (state=3): >>><<< 40074 1727204611.13637: stdout chunk (state=3): >>><<< 40074 1727204611.13661: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204611.13677: _low_level_execute_command(): starting 40074 1727204611.13681: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140 `" && echo ansible-tmp-1727204611.1365929-40267-181389641688140="` echo /root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140 `" ) && sleep 0' 40074 1727204611.14148: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204611.14151: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.14154: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
<<< 40074 1727204611.14163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.14221: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204611.14224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204611.14260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204611.17085: stdout chunk (state=3): >>>ansible-tmp-1727204611.1365929-40267-181389641688140=/root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140 <<< 40074 1727204611.17267: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204611.17307: stderr chunk (state=3): >>><<< 40074 1727204611.17310: stdout chunk (state=3): >>><<< 40074 1727204611.17330: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204611.1365929-40267-181389641688140=/root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204611.17364: variable 'ansible_module_compression' from source: unknown 40074 1727204611.17408: ANSIBALLZ: Using generic lock for ansible.legacy.command 40074 1727204611.17411: ANSIBALLZ: Acquiring lock 40074 1727204611.17414: ANSIBALLZ: Lock acquired: 139809964199616 40074 1727204611.17416: ANSIBALLZ: Creating module 40074 1727204611.29283: ANSIBALLZ: Writing module into payload 40074 1727204611.29496: ANSIBALLZ: Writing module 40074 1727204611.29500: ANSIBALLZ: Renaming module 40074 1727204611.29503: ANSIBALLZ: Done creating module 40074 1727204611.29505: variable 'ansible_facts' from source: unknown 40074 1727204611.29544: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/AnsiballZ_command.py 40074 1727204611.29918: Sending initial data 40074 1727204611.29921: Sent initial data (156 bytes) 40074 1727204611.30695: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204611.30700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204611.30712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.30775: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204611.30794: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204611.30863: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204611.33266: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204611.33313: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204611.33416: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpudlvltw9 /root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/AnsiballZ_command.py <<< 40074 1727204611.33426: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/AnsiballZ_command.py" <<< 40074 1727204611.33458: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpudlvltw9" to remote "/root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/AnsiballZ_command.py" <<< 40074 1727204611.34717: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204611.34758: stderr chunk (state=3): >>><<< 40074 1727204611.34771: stdout chunk (state=3): >>><<< 40074 1727204611.34811: done transferring module to remote 40074 1727204611.34842: _low_level_execute_command(): starting 40074 1727204611.34853: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/ /root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/AnsiballZ_command.py && sleep 0' 40074 1727204611.35626: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204611.35750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204611.35807: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204611.35850: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204611.38420: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204611.38461: stderr chunk (state=3): >>><<< 40074 1727204611.38475: stdout chunk (state=3): >>><<< 40074 1727204611.38530: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204611.38534: _low_level_execute_command(): starting 40074 1727204611.38537: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/AnsiballZ_command.py && sleep 0' 40074 1727204611.39206: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204611.39222: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204611.39239: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204611.39272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204611.39497: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.39511: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.39698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204611.39714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204611.39922: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 40074 1727204611.63439: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:31.628066", "end": "2024-09-24 15:03:31.633098", "delta": "0:00:00.005032", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204611.65912: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204611.65998: stderr chunk (state=3): >>><<< 40074 1727204611.66008: stdout chunk (state=3): >>><<< 40074 1727204611.66032: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:31.628066", "end": "2024-09-24 15:03:31.633098", "delta": "0:00:00.005032", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204611.66089: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204611.66115: _low_level_execute_command(): starting 40074 1727204611.66194: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204611.1365929-40267-181389641688140/ > /dev/null 2>&1 && sleep 0' 40074 1727204611.66791: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204611.66805: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204611.66848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.66862: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204611.66956: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204611.66983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204611.67003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204611.67023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204611.67101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204611.69901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204611.69939: stdout chunk (state=3): >>><<< 40074 1727204611.69942: stderr chunk (state=3): >>><<< 40074 1727204611.69995: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204611.69998: handler run complete 40074 1727204611.70009: Evaluated conditional (False): False 40074 1727204611.70042: attempt loop complete, returning result 40074 1727204611.70055: _execute() done 40074 1727204611.70064: dumping result to json 40074 1727204611.70126: done dumping result, returning 40074 1727204611.70129: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-9fd7-2501-00000000014e] 40074 1727204611.70132: sending task result for task 12b410aa-8751-9fd7-2501-00000000014e ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.005032", "end": "2024-09-24 15:03:31.633098", "rc": 0, "start": "2024-09-24 15:03:31.628066" } STDOUT: bonding_masters eth0 lo rpltstbr 40074 1727204611.70347: no more pending results, returning what we have 40074 1727204611.70353: results queue empty 40074 1727204611.70355: checking for any_errors_fatal 40074 1727204611.70357: done checking for any_errors_fatal 40074 1727204611.70358: checking for max_fail_percentage 40074 1727204611.70359: done checking for max_fail_percentage 40074 1727204611.70360: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.70363: done checking to see if all hosts have failed 40074 1727204611.70364: getting the 
remaining hosts for this loop 40074 1727204611.70366: done getting the remaining hosts for this loop 40074 1727204611.70371: getting the next task for host managed-node2 40074 1727204611.70379: done getting next task for host managed-node2 40074 1727204611.70383: ^ task is: TASK: Set current_interfaces 40074 1727204611.70388: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.70394: getting variables 40074 1727204611.70397: in VariableManager get_vars() 40074 1727204611.70445: Calling all_inventory to load vars for managed-node2 40074 1727204611.70449: Calling groups_inventory to load vars for managed-node2 40074 1727204611.70452: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.70465: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.70469: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.70474: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.71203: done sending task result for task 12b410aa-8751-9fd7-2501-00000000014e 40074 1727204611.71207: WORKER PROCESS EXITING 40074 1727204611.71232: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.71634: done with get_vars() 40074 1727204611.71648: done getting variables 40074 1727204611.71727: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.629) 0:00:05.479 ***** 40074 1727204611.71762: entering _queue_task() for managed-node2/set_fact 40074 1727204611.72097: worker is 1 (out of 1 available) 40074 1727204611.72117: exiting _queue_task() for managed-node2/set_fact 40074 1727204611.72139: done queuing things up, now waiting for results queue to drain 40074 1727204611.72141: waiting for pending results... 
40074 1727204611.72366: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 40074 1727204611.72517: in run() - task 12b410aa-8751-9fd7-2501-00000000014f 40074 1727204611.72543: variable 'ansible_search_path' from source: unknown 40074 1727204611.72553: variable 'ansible_search_path' from source: unknown 40074 1727204611.72600: calling self._execute() 40074 1727204611.72708: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.72723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.72743: variable 'omit' from source: magic vars 40074 1727204611.73565: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.73585: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.73600: variable 'omit' from source: magic vars 40074 1727204611.73664: variable 'omit' from source: magic vars 40074 1727204611.73800: variable '_current_interfaces' from source: set_fact 40074 1727204611.73887: variable 'omit' from source: magic vars 40074 1727204611.73940: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204611.73990: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204611.74018: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204611.74049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204611.74068: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204611.74110: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204611.74119: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.74128: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.74266: Set connection var ansible_pipelining to False 40074 1727204611.74280: Set connection var ansible_shell_executable to /bin/sh 40074 1727204611.74288: Set connection var ansible_shell_type to sh 40074 1727204611.74299: Set connection var ansible_connection to ssh 40074 1727204611.74313: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204611.74326: Set connection var ansible_timeout to 10 40074 1727204611.74365: variable 'ansible_shell_executable' from source: unknown 40074 1727204611.74495: variable 'ansible_connection' from source: unknown 40074 1727204611.74499: variable 'ansible_module_compression' from source: unknown 40074 1727204611.74501: variable 'ansible_shell_type' from source: unknown 40074 1727204611.74504: variable 'ansible_shell_executable' from source: unknown 40074 1727204611.74506: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.74509: variable 'ansible_pipelining' from source: unknown 40074 1727204611.74511: variable 'ansible_timeout' from source: unknown 40074 1727204611.74513: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.74598: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204611.74617: variable 'omit' from source: magic vars 40074 1727204611.74629: starting attempt loop 40074 1727204611.74644: running the handler 40074 1727204611.74664: handler run complete 40074 1727204611.74695: attempt loop complete, returning result 40074 1727204611.74699: _execute() done 40074 1727204611.74701: dumping result to json 40074 1727204611.74703: done dumping result, returning 40074 
1727204611.74748: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-9fd7-2501-00000000014f] 40074 1727204611.74751: sending task result for task 12b410aa-8751-9fd7-2501-00000000014f ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 40074 1727204611.74921: no more pending results, returning what we have 40074 1727204611.74925: results queue empty 40074 1727204611.74926: checking for any_errors_fatal 40074 1727204611.74935: done checking for any_errors_fatal 40074 1727204611.74936: checking for max_fail_percentage 40074 1727204611.74938: done checking for max_fail_percentage 40074 1727204611.74940: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.74941: done checking to see if all hosts have failed 40074 1727204611.74942: getting the remaining hosts for this loop 40074 1727204611.74944: done getting the remaining hosts for this loop 40074 1727204611.74949: getting the next task for host managed-node2 40074 1727204611.74959: done getting next task for host managed-node2 40074 1727204611.74962: ^ task is: TASK: Show current_interfaces 40074 1727204611.74966: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.74970: getting variables 40074 1727204611.74972: in VariableManager get_vars() 40074 1727204611.75023: Calling all_inventory to load vars for managed-node2 40074 1727204611.75027: Calling groups_inventory to load vars for managed-node2 40074 1727204611.75033: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.75047: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.75050: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.75055: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.75785: done sending task result for task 12b410aa-8751-9fd7-2501-00000000014f 40074 1727204611.75790: WORKER PROCESS EXITING 40074 1727204611.75949: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.76301: done with get_vars() 40074 1727204611.76313: done getting variables 40074 1727204611.76423: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.046) 0:00:05.526 ***** 40074 1727204611.76457: entering _queue_task() for managed-node2/debug 40074 1727204611.76459: Creating lock for debug 40074 1727204611.76763: worker is 1 (out of 1 available) 40074 1727204611.76778: exiting _queue_task() for managed-node2/debug 40074 1727204611.76895: done queuing things up, now waiting for results queue to drain 40074 1727204611.76898: waiting for pending results... 
40074 1727204611.77073: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 40074 1727204611.77202: in run() - task 12b410aa-8751-9fd7-2501-000000000136 40074 1727204611.77226: variable 'ansible_search_path' from source: unknown 40074 1727204611.77241: variable 'ansible_search_path' from source: unknown 40074 1727204611.77283: calling self._execute() 40074 1727204611.77391: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.77405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.77420: variable 'omit' from source: magic vars 40074 1727204611.77874: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.77899: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.77911: variable 'omit' from source: magic vars 40074 1727204611.77966: variable 'omit' from source: magic vars 40074 1727204611.78095: variable 'current_interfaces' from source: set_fact 40074 1727204611.78139: variable 'omit' from source: magic vars 40074 1727204611.78194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204611.78322: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204611.78326: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204611.78329: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204611.78334: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204611.78357: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204611.78367: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.78377: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.78517: Set connection var ansible_pipelining to False 40074 1727204611.78532: Set connection var ansible_shell_executable to /bin/sh 40074 1727204611.78541: Set connection var ansible_shell_type to sh 40074 1727204611.78549: Set connection var ansible_connection to ssh 40074 1727204611.78567: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204611.78579: Set connection var ansible_timeout to 10 40074 1727204611.78666: variable 'ansible_shell_executable' from source: unknown 40074 1727204611.78670: variable 'ansible_connection' from source: unknown 40074 1727204611.78673: variable 'ansible_module_compression' from source: unknown 40074 1727204611.78675: variable 'ansible_shell_type' from source: unknown 40074 1727204611.78679: variable 'ansible_shell_executable' from source: unknown 40074 1727204611.78681: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.78683: variable 'ansible_pipelining' from source: unknown 40074 1727204611.78686: variable 'ansible_timeout' from source: unknown 40074 1727204611.78688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.78843: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204611.78861: variable 'omit' from source: magic vars 40074 1727204611.78872: starting attempt loop 40074 1727204611.78884: running the handler 40074 1727204611.78944: handler run complete 40074 1727204611.78991: attempt loop complete, returning result 40074 1727204611.78994: _execute() done 40074 1727204611.78998: dumping result to json 40074 1727204611.79001: done dumping result, returning 40074 1727204611.79004: done 
running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-9fd7-2501-000000000136] 40074 1727204611.79011: sending task result for task 12b410aa-8751-9fd7-2501-000000000136 40074 1727204611.79168: done sending task result for task 12b410aa-8751-9fd7-2501-000000000136 40074 1727204611.79171: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 40074 1727204611.79259: no more pending results, returning what we have 40074 1727204611.79263: results queue empty 40074 1727204611.79264: checking for any_errors_fatal 40074 1727204611.79271: done checking for any_errors_fatal 40074 1727204611.79272: checking for max_fail_percentage 40074 1727204611.79274: done checking for max_fail_percentage 40074 1727204611.79275: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.79276: done checking to see if all hosts have failed 40074 1727204611.79277: getting the remaining hosts for this loop 40074 1727204611.79279: done getting the remaining hosts for this loop 40074 1727204611.79284: getting the next task for host managed-node2 40074 1727204611.79295: done getting next task for host managed-node2 40074 1727204611.79299: ^ task is: TASK: Manage test interface 40074 1727204611.79301: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.79306: getting variables 40074 1727204611.79308: in VariableManager get_vars() 40074 1727204611.79356: Calling all_inventory to load vars for managed-node2 40074 1727204611.79360: Calling groups_inventory to load vars for managed-node2 40074 1727204611.79364: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.79376: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.79380: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.79385: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.79874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.80276: done with get_vars() 40074 1727204611.80288: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:17 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.039) 0:00:05.565 ***** 40074 1727204611.80400: entering _queue_task() for managed-node2/include_tasks 40074 1727204611.80675: worker is 1 (out of 1 available) 40074 1727204611.80692: exiting _queue_task() for managed-node2/include_tasks 40074 1727204611.80705: done queuing things up, now waiting for results queue to drain 40074 1727204611.80706: waiting for pending results... 
40074 1727204611.81108: running TaskExecutor() for managed-node2/TASK: Manage test interface 40074 1727204611.81113: in run() - task 12b410aa-8751-9fd7-2501-00000000000d 40074 1727204611.81117: variable 'ansible_search_path' from source: unknown 40074 1727204611.81154: calling self._execute() 40074 1727204611.81262: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.81276: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.81295: variable 'omit' from source: magic vars 40074 1727204611.81770: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.81790: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.81802: _execute() done 40074 1727204611.81812: dumping result to json 40074 1727204611.81820: done dumping result, returning 40074 1727204611.81835: done running TaskExecutor() for managed-node2/TASK: Manage test interface [12b410aa-8751-9fd7-2501-00000000000d] 40074 1727204611.81847: sending task result for task 12b410aa-8751-9fd7-2501-00000000000d 40074 1727204611.81987: no more pending results, returning what we have 40074 1727204611.81994: in VariableManager get_vars() 40074 1727204611.82045: Calling all_inventory to load vars for managed-node2 40074 1727204611.82049: Calling groups_inventory to load vars for managed-node2 40074 1727204611.82052: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.82068: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.82072: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.82076: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.82583: done sending task result for task 12b410aa-8751-9fd7-2501-00000000000d 40074 1727204611.82587: WORKER PROCESS EXITING 40074 1727204611.82617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 
40074 1727204611.82981: done with get_vars() 40074 1727204611.82992: variable 'ansible_search_path' from source: unknown 40074 1727204611.83006: we have included files to process 40074 1727204611.83007: generating all_blocks data 40074 1727204611.83009: done generating all_blocks data 40074 1727204611.83015: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 40074 1727204611.83016: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 40074 1727204611.83019: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 40074 1727204611.83780: in VariableManager get_vars() 40074 1727204611.83812: done with get_vars() 40074 1727204611.84097: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 40074 1727204611.84868: done processing included file 40074 1727204611.84870: iterating over new_blocks loaded from include file 40074 1727204611.84872: in VariableManager get_vars() 40074 1727204611.84900: done with get_vars() 40074 1727204611.84903: filtering new block on tags 40074 1727204611.84951: done filtering new block on tags 40074 1727204611.84954: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 40074 1727204611.84975: extending task lists for all hosts with included blocks 40074 1727204611.85319: done extending task lists 40074 1727204611.85321: done processing included files 40074 1727204611.85322: results queue empty 40074 1727204611.85323: checking for any_errors_fatal 40074 1727204611.85326: done checking for any_errors_fatal 40074 1727204611.85327: checking for max_fail_percentage 40074 1727204611.85329: done checking for 
max_fail_percentage 40074 1727204611.85329: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.85333: done checking to see if all hosts have failed 40074 1727204611.85334: getting the remaining hosts for this loop 40074 1727204611.85336: done getting the remaining hosts for this loop 40074 1727204611.85339: getting the next task for host managed-node2 40074 1727204611.85344: done getting next task for host managed-node2 40074 1727204611.85346: ^ task is: TASK: Ensure state in ["present", "absent"] 40074 1727204611.85350: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.85352: getting variables 40074 1727204611.85354: in VariableManager get_vars() 40074 1727204611.85370: Calling all_inventory to load vars for managed-node2 40074 1727204611.85373: Calling groups_inventory to load vars for managed-node2 40074 1727204611.85376: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.85383: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.85386: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.85392: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.85645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.86005: done with get_vars() 40074 1727204611.86018: done getting variables 40074 1727204611.86097: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.057) 0:00:05.623 ***** 40074 1727204611.86128: entering _queue_task() for managed-node2/fail 40074 1727204611.86133: Creating lock for fail 40074 1727204611.86464: worker is 1 (out of 1 available) 40074 1727204611.86478: exiting _queue_task() for managed-node2/fail 40074 1727204611.86597: done queuing things up, now waiting for results queue to drain 40074 1727204611.86599: waiting for pending results... 
40074 1727204611.86783: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 40074 1727204611.86910: in run() - task 12b410aa-8751-9fd7-2501-00000000016a 40074 1727204611.86938: variable 'ansible_search_path' from source: unknown 40074 1727204611.86947: variable 'ansible_search_path' from source: unknown 40074 1727204611.86993: calling self._execute() 40074 1727204611.87101: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.87117: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.87135: variable 'omit' from source: magic vars 40074 1727204611.87607: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.87625: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.87801: variable 'state' from source: include params 40074 1727204611.87813: Evaluated conditional (state not in ["present", "absent"]): False 40074 1727204611.87821: when evaluation is False, skipping this task 40074 1727204611.87858: _execute() done 40074 1727204611.87861: dumping result to json 40074 1727204611.87863: done dumping result, returning 40074 1727204611.87866: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [12b410aa-8751-9fd7-2501-00000000016a] 40074 1727204611.87868: sending task result for task 12b410aa-8751-9fd7-2501-00000000016a skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 40074 1727204611.88021: no more pending results, returning what we have 40074 1727204611.88025: results queue empty 40074 1727204611.88027: checking for any_errors_fatal 40074 1727204611.88028: done checking for any_errors_fatal 40074 1727204611.88029: checking for max_fail_percentage 40074 1727204611.88033: done checking for max_fail_percentage 40074 1727204611.88034: checking to see if all hosts 
have failed and the running result is not ok 40074 1727204611.88036: done checking to see if all hosts have failed 40074 1727204611.88037: getting the remaining hosts for this loop 40074 1727204611.88038: done getting the remaining hosts for this loop 40074 1727204611.88042: getting the next task for host managed-node2 40074 1727204611.88049: done getting next task for host managed-node2 40074 1727204611.88052: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 40074 1727204611.88057: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.88060: getting variables 40074 1727204611.88062: in VariableManager get_vars() 40074 1727204611.88108: Calling all_inventory to load vars for managed-node2 40074 1727204611.88112: Calling groups_inventory to load vars for managed-node2 40074 1727204611.88115: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.88130: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.88136: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.88140: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.88683: done sending task result for task 12b410aa-8751-9fd7-2501-00000000016a 40074 1727204611.88687: WORKER PROCESS EXITING 40074 1727204611.88714: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.89053: done with get_vars() 40074 1727204611.89065: done getting variables 40074 1727204611.89127: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.030) 0:00:05.653 ***** 40074 1727204611.89159: entering _queue_task() for managed-node2/fail 40074 1727204611.89614: worker is 1 (out of 1 available) 40074 1727204611.89624: exiting _queue_task() for managed-node2/fail 40074 1727204611.89637: done queuing things up, now waiting for results queue to drain 40074 1727204611.89640: waiting for pending results... 
40074 1727204611.89719: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 40074 1727204611.89851: in run() - task 12b410aa-8751-9fd7-2501-00000000016b 40074 1727204611.89877: variable 'ansible_search_path' from source: unknown 40074 1727204611.89885: variable 'ansible_search_path' from source: unknown 40074 1727204611.89933: calling self._execute() 40074 1727204611.90082: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.90085: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.90090: variable 'omit' from source: magic vars 40074 1727204611.90649: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.90668: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.90871: variable 'type' from source: set_fact 40074 1727204611.90883: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 40074 1727204611.90951: when evaluation is False, skipping this task 40074 1727204611.90955: _execute() done 40074 1727204611.90958: dumping result to json 40074 1727204611.90961: done dumping result, returning 40074 1727204611.90964: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12b410aa-8751-9fd7-2501-00000000016b] 40074 1727204611.90966: sending task result for task 12b410aa-8751-9fd7-2501-00000000016b skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 40074 1727204611.91108: no more pending results, returning what we have 40074 1727204611.91113: results queue empty 40074 1727204611.91114: checking for any_errors_fatal 40074 1727204611.91120: done checking for any_errors_fatal 40074 1727204611.91121: checking for max_fail_percentage 40074 1727204611.91123: done checking for max_fail_percentage 40074 1727204611.91124: checking to see if all 
hosts have failed and the running result is not ok 40074 1727204611.91126: done checking to see if all hosts have failed 40074 1727204611.91127: getting the remaining hosts for this loop 40074 1727204611.91128: done getting the remaining hosts for this loop 40074 1727204611.91135: getting the next task for host managed-node2 40074 1727204611.91142: done getting next task for host managed-node2 40074 1727204611.91146: ^ task is: TASK: Include the task 'show_interfaces.yml' 40074 1727204611.91150: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.91155: getting variables 40074 1727204611.91157: in VariableManager get_vars() 40074 1727204611.91308: Calling all_inventory to load vars for managed-node2 40074 1727204611.91312: Calling groups_inventory to load vars for managed-node2 40074 1727204611.91315: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.91333: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.91337: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.91341: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.91764: done sending task result for task 12b410aa-8751-9fd7-2501-00000000016b 40074 1727204611.91768: WORKER PROCESS EXITING 40074 1727204611.91797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.92187: done with get_vars() 40074 1727204611.92201: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.031) 0:00:05.685 ***** 40074 1727204611.92327: entering _queue_task() for managed-node2/include_tasks 40074 1727204611.93202: worker is 1 (out of 1 available) 40074 1727204611.93216: exiting _queue_task() for managed-node2/include_tasks 40074 1727204611.93234: done queuing things up, now waiting for results queue to drain 40074 1727204611.93236: waiting for pending results... 
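The skip shown in the trace above (`"false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]"`) follows the usual `when:` rule: the rendered conditional is truth-tested, and a False result produces a "skipped" task result instead of running the action. A minimal sketch of that behaviour, not Ansible's actual `TaskExecutor` code, with a hypothetical interface type (the trace does not print the real value of `type`):

```python
# Minimal sketch (not Ansible's real executor) of the when-clause behaviour
# in the trace: a False conditional yields a skip result, not an action run.
def run_task(when_result: bool, false_condition: str) -> dict:
    if not when_result:
        # Mirrors the "skipping: [managed-node2]" payload seen in the log.
        return {
            "changed": False,
            "false_condition": false_condition,
            "skip_reason": "Conditional result was False",
        }
    return {"changed": True}

iface_type = "veth"  # hypothetical value; any of the three allowed kinds skips
result = run_task(
    iface_type not in ["dummy", "tap", "veth"],
    'type not in ["dummy", "tap", "veth"]',
)
print(result["skip_reason"])  # Conditional result was False
```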
40074 1727204611.93774: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 40074 1727204611.94007: in run() - task 12b410aa-8751-9fd7-2501-00000000016c 40074 1727204611.94037: variable 'ansible_search_path' from source: unknown 40074 1727204611.94046: variable 'ansible_search_path' from source: unknown 40074 1727204611.94109: calling self._execute() 40074 1727204611.94327: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.94372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.94444: variable 'omit' from source: magic vars 40074 1727204611.95025: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.95049: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204611.95064: _execute() done 40074 1727204611.95081: dumping result to json 40074 1727204611.95096: done dumping result, returning 40074 1727204611.95185: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-9fd7-2501-00000000016c] 40074 1727204611.95192: sending task result for task 12b410aa-8751-9fd7-2501-00000000016c 40074 1727204611.95270: done sending task result for task 12b410aa-8751-9fd7-2501-00000000016c 40074 1727204611.95274: WORKER PROCESS EXITING 40074 1727204611.95322: no more pending results, returning what we have 40074 1727204611.95327: in VariableManager get_vars() 40074 1727204611.95383: Calling all_inventory to load vars for managed-node2 40074 1727204611.95388: Calling groups_inventory to load vars for managed-node2 40074 1727204611.95393: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.95410: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.95414: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.95418: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.95906: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.96274: done with get_vars() 40074 1727204611.96284: variable 'ansible_search_path' from source: unknown 40074 1727204611.96285: variable 'ansible_search_path' from source: unknown 40074 1727204611.96359: we have included files to process 40074 1727204611.96361: generating all_blocks data 40074 1727204611.96364: done generating all_blocks data 40074 1727204611.96370: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204611.96372: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204611.96375: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204611.96547: in VariableManager get_vars() 40074 1727204611.96636: done with get_vars() 40074 1727204611.96930: done processing included file 40074 1727204611.96935: iterating over new_blocks loaded from include file 40074 1727204611.96937: in VariableManager get_vars() 40074 1727204611.96962: done with get_vars() 40074 1727204611.96964: filtering new block on tags 40074 1727204611.96988: done filtering new block on tags 40074 1727204611.96994: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 40074 1727204611.97001: extending task lists for all hosts with included blocks 40074 1727204611.97884: done extending task lists 40074 1727204611.97886: done processing included files 40074 1727204611.97887: results queue empty 40074 1727204611.97888: checking for any_errors_fatal 40074 1727204611.97894: done checking for any_errors_fatal 40074 1727204611.97895: checking for 
max_fail_percentage 40074 1727204611.97897: done checking for max_fail_percentage 40074 1727204611.97898: checking to see if all hosts have failed and the running result is not ok 40074 1727204611.97899: done checking to see if all hosts have failed 40074 1727204611.97900: getting the remaining hosts for this loop 40074 1727204611.97901: done getting the remaining hosts for this loop 40074 1727204611.97905: getting the next task for host managed-node2 40074 1727204611.97911: done getting next task for host managed-node2 40074 1727204611.97913: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 40074 1727204611.97917: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204611.97920: getting variables 40074 1727204611.97921: in VariableManager get_vars() 40074 1727204611.97940: Calling all_inventory to load vars for managed-node2 40074 1727204611.97943: Calling groups_inventory to load vars for managed-node2 40074 1727204611.97946: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204611.97953: Calling all_plugins_play to load vars for managed-node2 40074 1727204611.97956: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204611.97960: Calling groups_plugins_play to load vars for managed-node2 40074 1727204611.98306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204611.98664: done with get_vars() 40074 1727204611.98676: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:31 -0400 (0:00:00.064) 0:00:05.749 ***** 40074 1727204611.98776: entering _queue_task() for managed-node2/include_tasks 40074 1727204611.99114: worker is 1 (out of 1 available) 40074 1727204611.99128: exiting _queue_task() for managed-node2/include_tasks 40074 1727204611.99144: done queuing things up, now waiting for results queue to drain 40074 1727204611.99146: waiting for pending results... 
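The include sequence narrated above ("generating all_blocks data" → "filtering new block on tags" → "extending task lists for all hosts with included blocks") can be summarised in a rough sketch. This is a hypothetical simplification, not Ansible's real block/strategy API; the task names are taken from the surrounding log:

```python
# Rough sketch (hypothetical, not Ansible's actual classes) of the
# include_tasks flow the log narrates: blocks loaded from the included file
# are filtered on tags, then appended to the host's task list.
def process_included_blocks(new_blocks, host_tasks, run_tags=None):
    if run_tags:  # "filtering new block on tags"
        new_blocks = [
            b for b in new_blocks
            if set(b.get("tags", [])) & set(run_tags)
        ]
    host_tasks.extend(new_blocks)  # "extending task lists for all hosts"
    return host_tasks

tasks = [{"name": "Include the task 'get_current_interfaces.yml'"}]
included = [{"name": "Gather current interface info", "tags": []}]
process_included_blocks(included, tasks)  # no --tags filter in this run
print(len(tasks))  # 2
```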
40074 1727204611.99433: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 40074 1727204611.99512: in run() - task 12b410aa-8751-9fd7-2501-00000000019d 40074 1727204611.99526: variable 'ansible_search_path' from source: unknown 40074 1727204611.99531: variable 'ansible_search_path' from source: unknown 40074 1727204611.99563: calling self._execute() 40074 1727204611.99645: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204611.99651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204611.99661: variable 'omit' from source: magic vars 40074 1727204611.99985: variable 'ansible_distribution_major_version' from source: facts 40074 1727204611.99999: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204612.00005: _execute() done 40074 1727204612.00010: dumping result to json 40074 1727204612.00013: done dumping result, returning 40074 1727204612.00020: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-9fd7-2501-00000000019d] 40074 1727204612.00025: sending task result for task 12b410aa-8751-9fd7-2501-00000000019d 40074 1727204612.00123: done sending task result for task 12b410aa-8751-9fd7-2501-00000000019d 40074 1727204612.00126: WORKER PROCESS EXITING 40074 1727204612.00181: no more pending results, returning what we have 40074 1727204612.00185: in VariableManager get_vars() 40074 1727204612.00229: Calling all_inventory to load vars for managed-node2 40074 1727204612.00232: Calling groups_inventory to load vars for managed-node2 40074 1727204612.00235: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204612.00245: Calling all_plugins_play to load vars for managed-node2 40074 1727204612.00249: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204612.00252: Calling groups_plugins_play to load vars for managed-node2 40074 
1727204612.00601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204612.00982: done with get_vars() 40074 1727204612.00995: variable 'ansible_search_path' from source: unknown 40074 1727204612.00996: variable 'ansible_search_path' from source: unknown 40074 1727204612.01080: we have included files to process 40074 1727204612.01082: generating all_blocks data 40074 1727204612.01084: done generating all_blocks data 40074 1727204612.01085: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204612.01086: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204612.01091: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204612.01450: done processing included file 40074 1727204612.01453: iterating over new_blocks loaded from include file 40074 1727204612.01455: in VariableManager get_vars() 40074 1727204612.01493: done with get_vars() 40074 1727204612.01495: filtering new block on tags 40074 1727204612.01520: done filtering new block on tags 40074 1727204612.01523: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 40074 1727204612.01529: extending task lists for all hosts with included blocks 40074 1727204612.01758: done extending task lists 40074 1727204612.01760: done processing included files 40074 1727204612.01761: results queue empty 40074 1727204612.01762: checking for any_errors_fatal 40074 1727204612.01765: done checking for any_errors_fatal 40074 1727204612.01766: checking for max_fail_percentage 40074 1727204612.01768: done 
checking for max_fail_percentage 40074 1727204612.01769: checking to see if all hosts have failed and the running result is not ok 40074 1727204612.01770: done checking to see if all hosts have failed 40074 1727204612.01771: getting the remaining hosts for this loop 40074 1727204612.01772: done getting the remaining hosts for this loop 40074 1727204612.01775: getting the next task for host managed-node2 40074 1727204612.01780: done getting next task for host managed-node2 40074 1727204612.01783: ^ task is: TASK: Gather current interface info 40074 1727204612.01787: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204612.01792: getting variables 40074 1727204612.01793: in VariableManager get_vars() 40074 1727204612.01820: Calling all_inventory to load vars for managed-node2 40074 1727204612.01823: Calling groups_inventory to load vars for managed-node2 40074 1727204612.01826: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204612.01832: Calling all_plugins_play to load vars for managed-node2 40074 1727204612.01836: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204612.01840: Calling groups_plugins_play to load vars for managed-node2 40074 1727204612.02103: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204612.02531: done with get_vars() 40074 1727204612.02544: done getting variables 40074 1727204612.02611: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.038) 0:00:05.788 ***** 40074 1727204612.02647: entering _queue_task() for managed-node2/command 40074 1727204612.03046: worker is 1 (out of 1 available) 40074 1727204612.03062: exiting _queue_task() for managed-node2/command 40074 1727204612.03077: done queuing things up, now waiting for results queue to drain 40074 1727204612.03078: waiting for pending results... 
40074 1727204612.03308: running TaskExecutor() for managed-node2/TASK: Gather current interface info 40074 1727204612.03596: in run() - task 12b410aa-8751-9fd7-2501-0000000001d4 40074 1727204612.03599: variable 'ansible_search_path' from source: unknown 40074 1727204612.03602: variable 'ansible_search_path' from source: unknown 40074 1727204612.03605: calling self._execute() 40074 1727204612.03607: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.03609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.03611: variable 'omit' from source: magic vars 40074 1727204612.04036: variable 'ansible_distribution_major_version' from source: facts 40074 1727204612.04055: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204612.04065: variable 'omit' from source: magic vars 40074 1727204612.04143: variable 'omit' from source: magic vars 40074 1727204612.04192: variable 'omit' from source: magic vars 40074 1727204612.04246: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204612.04296: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204612.04327: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204612.04358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204612.04379: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204612.04420: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204612.04433: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.04445: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 
1727204612.04591: Set connection var ansible_pipelining to False 40074 1727204612.04606: Set connection var ansible_shell_executable to /bin/sh 40074 1727204612.04613: Set connection var ansible_shell_type to sh 40074 1727204612.04620: Set connection var ansible_connection to ssh 40074 1727204612.04634: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204612.04646: Set connection var ansible_timeout to 10 40074 1727204612.04685: variable 'ansible_shell_executable' from source: unknown 40074 1727204612.04697: variable 'ansible_connection' from source: unknown 40074 1727204612.04706: variable 'ansible_module_compression' from source: unknown 40074 1727204612.04713: variable 'ansible_shell_type' from source: unknown 40074 1727204612.04766: variable 'ansible_shell_executable' from source: unknown 40074 1727204612.04769: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.04772: variable 'ansible_pipelining' from source: unknown 40074 1727204612.04774: variable 'ansible_timeout' from source: unknown 40074 1727204612.04776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.04936: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204612.04957: variable 'omit' from source: magic vars 40074 1727204612.04968: starting attempt loop 40074 1727204612.04978: running the handler 40074 1727204612.05008: _low_level_execute_command(): starting 40074 1727204612.05020: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204612.06028: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204612.06051: stderr chunk (state=3): >>>debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.06091: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204612.06117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204612.06162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.06196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.08237: stdout chunk (state=3): >>>/root <<< 40074 1727204612.08329: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.08360: stderr chunk (state=3): >>><<< 40074 1727204612.08377: stdout chunk (state=3): >>><<< 40074 1727204612.08413: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204612.08446: _low_level_execute_command(): starting 40074 1727204612.08794: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390 `" && echo ansible-tmp-1727204612.0842843-40307-157214837297390="` echo /root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390 `" ) && sleep 0' 40074 1727204612.10018: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204612.10088: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.10250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204612.10302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.10421: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.12559: stdout chunk (state=3): >>>ansible-tmp-1727204612.0842843-40307-157214837297390=/root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390 <<< 40074 1727204612.12710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.12872: stderr chunk (state=3): >>><<< 40074 1727204612.12882: stdout chunk (state=3): >>><<< 40074 1727204612.12933: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204612.0842843-40307-157214837297390=/root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204612.13003: variable 'ansible_module_compression' from source: unknown 40074 1727204612.13192: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204612.13302: variable 'ansible_facts' from source: unknown 40074 1727204612.13464: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/AnsiballZ_command.py 40074 1727204612.13858: Sending initial data 40074 1727204612.13862: Sent initial data (156 bytes) 40074 1727204612.15243: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.15453: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 
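The remote temp-directory command quoted earlier in this task (`/bin/sh -c '( umask 77 && mkdir -p ... && echo ... ) && sleep 0'`) can be reproduced stand-alone. The directory name below is illustrative, not the `ansible-tmp-1727204612...` path from this run:

```shell
# Recreate the private (mode 0700) per-task tmpdir idiom from the trace:
# ensure the base exists, create a unique directory, and echo its path back
# so the controller knows where to upload AnsiballZ_command.py.
umask 77
base="${HOME}/.ansible/tmp"
tmp="${base}/ansible-tmp-$(date +%s)-$$-example"   # illustrative name format
mkdir -p "$base" && mkdir "$tmp" && echo "$tmp"
```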
40074 1727204612.15527: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.17323: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 40074 1727204612.17418: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204612.17465: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204612.17505: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp3e3z5zeo /root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/AnsiballZ_command.py <<< 40074 1727204612.17527: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/AnsiballZ_command.py" <<< 40074 1727204612.17540: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp3e3z5zeo" to remote "/root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/AnsiballZ_command.py" <<< 40074 1727204612.17638: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/AnsiballZ_command.py" <<< 40074 1727204612.19392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.19494: stderr chunk (state=3): >>><<< 40074 1727204612.19498: stdout chunk (state=3): >>><<< 40074 1727204612.19525: done transferring module to remote 40074 1727204612.19541: _low_level_execute_command(): starting 40074 1727204612.19547: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/ /root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/AnsiballZ_command.py && sleep 0' 40074 1727204612.20880: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.21109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.21225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.23341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.23345: stdout chunk (state=3): >>><<< 40074 1727204612.23354: stderr chunk (state=3): >>><<< 40074 1727204612.23374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204612.23381: _low_level_execute_command(): starting 40074 1727204612.23384: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/AnsiballZ_command.py && sleep 0' 40074 1727204612.24572: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204612.24577: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204612.24580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.24583: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204612.24585: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204612.24588: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 40074 1727204612.24594: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204612.24600: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204612.24615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204612.24647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.24778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK <<< 40074 1727204612.25101: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.25105: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.43154: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:32.426885", "end": "2024-09-24 15:03:32.430565", "delta": "0:00:00.003680", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204612.44841: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204612.44900: stderr chunk (state=3): >>><<< 40074 1727204612.44904: stdout chunk (state=3): >>><<< 40074 1727204612.44928: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:32.426885", "end": "2024-09-24 15:03:32.430565", "delta": "0:00:00.003680", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204612.44968: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204612.44976: _low_level_execute_command(): starting 40074 1727204612.44982: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204612.0842843-40307-157214837297390/ > /dev/null 2>&1 && sleep 0' 40074 1727204612.45469: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204612.45473: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204612.45476: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.45478: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204612.45480: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.45535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204612.45540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.45586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.47547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.47594: stderr chunk (state=3): >>><<< 40074 1727204612.47597: stdout chunk (state=3): >>><<< 40074 1727204612.47612: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204612.47620: handler run complete 40074 1727204612.47645: Evaluated conditional (False): False 40074 1727204612.47657: attempt loop complete, returning result 40074 1727204612.47660: _execute() done 40074 1727204612.47667: dumping result to json 40074 1727204612.47675: done dumping result, returning 40074 1727204612.47683: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-9fd7-2501-0000000001d4] 40074 1727204612.47688: sending task result for task 12b410aa-8751-9fd7-2501-0000000001d4 40074 1727204612.47796: done sending task result for task 12b410aa-8751-9fd7-2501-0000000001d4 40074 1727204612.47799: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003680", "end": "2024-09-24 15:03:32.430565", "rc": 0, "start": "2024-09-24 15:03:32.426885" } STDOUT: bonding_masters eth0 lo rpltstbr 40074 1727204612.47885: no more pending results, returning what we have 40074 1727204612.47888: results queue empty 40074 1727204612.47891: checking for any_errors_fatal 40074 1727204612.47894: done checking for any_errors_fatal 40074 
1727204612.47895: checking for max_fail_percentage 40074 1727204612.47896: done checking for max_fail_percentage 40074 1727204612.47897: checking to see if all hosts have failed and the running result is not ok 40074 1727204612.47899: done checking to see if all hosts have failed 40074 1727204612.47899: getting the remaining hosts for this loop 40074 1727204612.47901: done getting the remaining hosts for this loop 40074 1727204612.47905: getting the next task for host managed-node2 40074 1727204612.47912: done getting next task for host managed-node2 40074 1727204612.47915: ^ task is: TASK: Set current_interfaces 40074 1727204612.47921: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204612.47925: getting variables 40074 1727204612.47927: in VariableManager get_vars() 40074 1727204612.47966: Calling all_inventory to load vars for managed-node2 40074 1727204612.47969: Calling groups_inventory to load vars for managed-node2 40074 1727204612.47972: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204612.47982: Calling all_plugins_play to load vars for managed-node2 40074 1727204612.47986: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204612.47996: Calling groups_plugins_play to load vars for managed-node2 40074 1727204612.48179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204612.48383: done with get_vars() 40074 1727204612.48395: done getting variables 40074 1727204612.48449: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.458) 0:00:06.246 ***** 40074 1727204612.48475: entering _queue_task() for managed-node2/set_fact 40074 1727204612.48686: worker is 1 (out of 1 available) 40074 1727204612.48703: exiting _queue_task() for managed-node2/set_fact 40074 1727204612.48716: done queuing things up, now waiting for results queue to drain 40074 1727204612.48718: waiting for pending results... 
40074 1727204612.48886: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 40074 1727204612.48973: in run() - task 12b410aa-8751-9fd7-2501-0000000001d5 40074 1727204612.48984: variable 'ansible_search_path' from source: unknown 40074 1727204612.48988: variable 'ansible_search_path' from source: unknown 40074 1727204612.49023: calling self._execute() 40074 1727204612.49103: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.49110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.49119: variable 'omit' from source: magic vars 40074 1727204612.49483: variable 'ansible_distribution_major_version' from source: facts 40074 1727204612.49498: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204612.49505: variable 'omit' from source: magic vars 40074 1727204612.49553: variable 'omit' from source: magic vars 40074 1727204612.49647: variable '_current_interfaces' from source: set_fact 40074 1727204612.49699: variable 'omit' from source: magic vars 40074 1727204612.49739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204612.49770: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204612.49788: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204612.49806: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204612.49819: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204612.49850: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204612.49853: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.49857: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.49945: Set connection var ansible_pipelining to False 40074 1727204612.49955: Set connection var ansible_shell_executable to /bin/sh 40074 1727204612.49958: Set connection var ansible_shell_type to sh 40074 1727204612.49962: Set connection var ansible_connection to ssh 40074 1727204612.49970: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204612.49976: Set connection var ansible_timeout to 10 40074 1727204612.50002: variable 'ansible_shell_executable' from source: unknown 40074 1727204612.50005: variable 'ansible_connection' from source: unknown 40074 1727204612.50008: variable 'ansible_module_compression' from source: unknown 40074 1727204612.50010: variable 'ansible_shell_type' from source: unknown 40074 1727204612.50015: variable 'ansible_shell_executable' from source: unknown 40074 1727204612.50018: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.50024: variable 'ansible_pipelining' from source: unknown 40074 1727204612.50027: variable 'ansible_timeout' from source: unknown 40074 1727204612.50037: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.50158: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204612.50168: variable 'omit' from source: magic vars 40074 1727204612.50174: starting attempt loop 40074 1727204612.50177: running the handler 40074 1727204612.50190: handler run complete 40074 1727204612.50199: attempt loop complete, returning result 40074 1727204612.50202: _execute() done 40074 1727204612.50206: dumping result to json 40074 1727204612.50211: done dumping result, returning 40074 
1727204612.50219: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-9fd7-2501-0000000001d5] 40074 1727204612.50223: sending task result for task 12b410aa-8751-9fd7-2501-0000000001d5 40074 1727204612.50311: done sending task result for task 12b410aa-8751-9fd7-2501-0000000001d5 40074 1727204612.50315: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo", "rpltstbr" ] }, "changed": false } 40074 1727204612.50380: no more pending results, returning what we have 40074 1727204612.50383: results queue empty 40074 1727204612.50384: checking for any_errors_fatal 40074 1727204612.50392: done checking for any_errors_fatal 40074 1727204612.50393: checking for max_fail_percentage 40074 1727204612.50395: done checking for max_fail_percentage 40074 1727204612.50396: checking to see if all hosts have failed and the running result is not ok 40074 1727204612.50397: done checking to see if all hosts have failed 40074 1727204612.50398: getting the remaining hosts for this loop 40074 1727204612.50399: done getting the remaining hosts for this loop 40074 1727204612.50404: getting the next task for host managed-node2 40074 1727204612.50412: done getting next task for host managed-node2 40074 1727204612.50414: ^ task is: TASK: Show current_interfaces 40074 1727204612.50418: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204612.50422: getting variables 40074 1727204612.50423: in VariableManager get_vars() 40074 1727204612.50460: Calling all_inventory to load vars for managed-node2 40074 1727204612.50462: Calling groups_inventory to load vars for managed-node2 40074 1727204612.50465: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204612.50475: Calling all_plugins_play to load vars for managed-node2 40074 1727204612.50477: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204612.50481: Calling groups_plugins_play to load vars for managed-node2 40074 1727204612.50698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204612.50897: done with get_vars() 40074 1727204612.50905: done getting variables 40074 1727204612.50955: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.025) 0:00:06.271 ***** 40074 1727204612.50979: entering _queue_task() for managed-node2/debug 40074 1727204612.51176: worker is 1 (out of 1 available) 40074 1727204612.51192: exiting _queue_task() for managed-node2/debug 40074 1727204612.51205: done queuing things up, now waiting for results queue to drain 40074 1727204612.51206: waiting for 
pending results... 40074 1727204612.51372: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 40074 1727204612.51453: in run() - task 12b410aa-8751-9fd7-2501-00000000019e 40074 1727204612.51465: variable 'ansible_search_path' from source: unknown 40074 1727204612.51469: variable 'ansible_search_path' from source: unknown 40074 1727204612.51503: calling self._execute() 40074 1727204612.51586: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.51594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.51604: variable 'omit' from source: magic vars 40074 1727204612.51912: variable 'ansible_distribution_major_version' from source: facts 40074 1727204612.51923: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204612.51929: variable 'omit' from source: magic vars 40074 1727204612.51970: variable 'omit' from source: magic vars 40074 1727204612.52055: variable 'current_interfaces' from source: set_fact 40074 1727204612.52078: variable 'omit' from source: magic vars 40074 1727204612.52117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204612.52149: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204612.52167: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204612.52183: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204612.52198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204612.52226: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204612.52229: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.52236: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.52323: Set connection var ansible_pipelining to False 40074 1727204612.52329: Set connection var ansible_shell_executable to /bin/sh 40074 1727204612.52332: Set connection var ansible_shell_type to sh 40074 1727204612.52339: Set connection var ansible_connection to ssh 40074 1727204612.52346: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204612.52352: Set connection var ansible_timeout to 10 40074 1727204612.52374: variable 'ansible_shell_executable' from source: unknown 40074 1727204612.52378: variable 'ansible_connection' from source: unknown 40074 1727204612.52380: variable 'ansible_module_compression' from source: unknown 40074 1727204612.52383: variable 'ansible_shell_type' from source: unknown 40074 1727204612.52388: variable 'ansible_shell_executable' from source: unknown 40074 1727204612.52393: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.52398: variable 'ansible_pipelining' from source: unknown 40074 1727204612.52402: variable 'ansible_timeout' from source: unknown 40074 1727204612.52407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.52524: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204612.52539: variable 'omit' from source: magic vars 40074 1727204612.52545: starting attempt loop 40074 1727204612.52548: running the handler 40074 1727204612.52588: handler run complete 40074 1727204612.52602: attempt loop complete, returning result 40074 1727204612.52605: _execute() done 40074 1727204612.52609: dumping result to json 40074 1727204612.52614: done dumping result, returning 40074 
1727204612.52621: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-9fd7-2501-00000000019e] 40074 1727204612.52626: sending task result for task 12b410aa-8751-9fd7-2501-00000000019e 40074 1727204612.52702: done sending task result for task 12b410aa-8751-9fd7-2501-00000000019e 40074 1727204612.52706: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo', 'rpltstbr'] 40074 1727204612.52801: no more pending results, returning what we have 40074 1727204612.52804: results queue empty 40074 1727204612.52805: checking for any_errors_fatal 40074 1727204612.52810: done checking for any_errors_fatal 40074 1727204612.52811: checking for max_fail_percentage 40074 1727204612.52813: done checking for max_fail_percentage 40074 1727204612.52814: checking to see if all hosts have failed and the running result is not ok 40074 1727204612.52815: done checking to see if all hosts have failed 40074 1727204612.52816: getting the remaining hosts for this loop 40074 1727204612.52817: done getting the remaining hosts for this loop 40074 1727204612.52821: getting the next task for host managed-node2 40074 1727204612.52828: done getting next task for host managed-node2 40074 1727204612.52833: ^ task is: TASK: Install iproute 40074 1727204612.52836: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204612.52839: getting variables 40074 1727204612.52841: in VariableManager get_vars() 40074 1727204612.52876: Calling all_inventory to load vars for managed-node2 40074 1727204612.52879: Calling groups_inventory to load vars for managed-node2 40074 1727204612.52882: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204612.52894: Calling all_plugins_play to load vars for managed-node2 40074 1727204612.52897: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204612.52901: Calling groups_plugins_play to load vars for managed-node2 40074 1727204612.53078: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204612.53301: done with get_vars() 40074 1727204612.53309: done getting variables 40074 1727204612.53356: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 15:03:32 -0400 (0:00:00.023) 0:00:06.295 ***** 40074 1727204612.53379: entering _queue_task() for managed-node2/package 40074 1727204612.53579: worker is 1 (out of 1 available) 40074 1727204612.53596: exiting _queue_task() for managed-node2/package 40074 1727204612.53608: done queuing things up, now waiting for results queue to drain 40074 1727204612.53610: waiting for pending results... 
40074 1727204612.53762: running TaskExecutor() for managed-node2/TASK: Install iproute 40074 1727204612.53834: in run() - task 12b410aa-8751-9fd7-2501-00000000016d 40074 1727204612.53846: variable 'ansible_search_path' from source: unknown 40074 1727204612.53849: variable 'ansible_search_path' from source: unknown 40074 1727204612.53880: calling self._execute() 40074 1727204612.53954: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.53958: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.54065: variable 'omit' from source: magic vars 40074 1727204612.54257: variable 'ansible_distribution_major_version' from source: facts 40074 1727204612.54268: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204612.54280: variable 'omit' from source: magic vars 40074 1727204612.54312: variable 'omit' from source: magic vars 40074 1727204612.54469: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204612.56329: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204612.56385: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204612.56417: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204612.56460: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204612.56484: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204612.56566: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204612.56596: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204612.56619: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204612.56652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204612.56665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204612.56749: variable '__network_is_ostree' from source: set_fact 40074 1727204612.56753: variable 'omit' from source: magic vars 40074 1727204612.56780: variable 'omit' from source: magic vars 40074 1727204612.56808: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204612.56835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204612.56848: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204612.56864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204612.56875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204612.56909: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204612.56914: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.56916: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 40074 1727204612.56998: Set connection var ansible_pipelining to False 40074 1727204612.57004: Set connection var ansible_shell_executable to /bin/sh 40074 1727204612.57008: Set connection var ansible_shell_type to sh 40074 1727204612.57013: Set connection var ansible_connection to ssh 40074 1727204612.57021: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204612.57028: Set connection var ansible_timeout to 10 40074 1727204612.57052: variable 'ansible_shell_executable' from source: unknown 40074 1727204612.57055: variable 'ansible_connection' from source: unknown 40074 1727204612.57058: variable 'ansible_module_compression' from source: unknown 40074 1727204612.57062: variable 'ansible_shell_type' from source: unknown 40074 1727204612.57065: variable 'ansible_shell_executable' from source: unknown 40074 1727204612.57070: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204612.57075: variable 'ansible_pipelining' from source: unknown 40074 1727204612.57079: variable 'ansible_timeout' from source: unknown 40074 1727204612.57084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204612.57171: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204612.57181: variable 'omit' from source: magic vars 40074 1727204612.57186: starting attempt loop 40074 1727204612.57191: running the handler 40074 1727204612.57198: variable 'ansible_facts' from source: unknown 40074 1727204612.57201: variable 'ansible_facts' from source: unknown 40074 1727204612.57234: _low_level_execute_command(): starting 40074 1727204612.57238: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 
1727204612.57773: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204612.57777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.57780: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204612.57782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.57844: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204612.57847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.57901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.59700: stdout chunk (state=3): >>>/root <<< 40074 1727204612.59810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.59863: stderr chunk (state=3): >>><<< 40074 1727204612.59866: stdout chunk (state=3): >>><<< 40074 1727204612.59889: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204612.59908: _low_level_execute_command(): starting 40074 1727204612.59911: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974 `" && echo ansible-tmp-1727204612.5989158-40372-52327081087974="` echo /root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974 `" ) && sleep 0' 40074 1727204612.60376: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204612.60380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204612.60382: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204612.60385: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204612.60387: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.60439: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204612.60443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.60488: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.62581: stdout chunk (state=3): >>>ansible-tmp-1727204612.5989158-40372-52327081087974=/root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974 <<< 40074 1727204612.62704: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.62757: stderr chunk (state=3): >>><<< 40074 1727204612.62760: stdout chunk (state=3): >>><<< 40074 1727204612.62776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204612.5989158-40372-52327081087974=/root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204612.62806: variable 'ansible_module_compression' from source: unknown 40074 1727204612.62861: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 40074 1727204612.62865: ANSIBALLZ: Acquiring lock 40074 1727204612.62868: ANSIBALLZ: Lock acquired: 139809964199616 40074 1727204612.62870: ANSIBALLZ: Creating module 40074 1727204612.78360: ANSIBALLZ: Writing module into payload 40074 1727204612.78796: ANSIBALLZ: Writing module 40074 1727204612.78800: ANSIBALLZ: Renaming module 40074 1727204612.78802: ANSIBALLZ: Done creating module 40074 1727204612.78804: variable 'ansible_facts' from source: unknown 40074 1727204612.78815: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/AnsiballZ_dnf.py 40074 1727204612.79072: Sending initial data 40074 1727204612.79083: Sent initial data (151 bytes) 40074 1727204612.79625: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204612.79705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.79758: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204612.79773: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204612.79800: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.79888: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.81644: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204612.81706: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204612.81769: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpbfmzm_cd /root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/AnsiballZ_dnf.py <<< 40074 1727204612.81794: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/AnsiballZ_dnf.py" <<< 40074 1727204612.81840: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpbfmzm_cd" to remote "/root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/AnsiballZ_dnf.py" <<< 40074 1727204612.83326: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.83426: stderr chunk (state=3): >>><<< 40074 1727204612.83433: stdout chunk (state=3): >>><<< 40074 1727204612.83525: done transferring module to remote 40074 1727204612.83530: _low_level_execute_command(): starting 40074 1727204612.83536: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/ /root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/AnsiballZ_dnf.py && sleep 0' 40074 1727204612.84186: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204612.84299: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204612.84327: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204612.84351: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204612.84367: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.84443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204612.86404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204612.86488: stderr chunk (state=3): >>><<< 40074 1727204612.86502: stdout chunk (state=3): >>><<< 40074 1727204612.86525: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204612.86534: _low_level_execute_command(): starting 40074 1727204612.86544: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/AnsiballZ_dnf.py && sleep 0' 40074 1727204612.87179: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204612.87198: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204612.87213: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204612.87231: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204612.87344: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204612.87371: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204612.87451: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 
2 <<< 40074 1727204614.40047: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 40074 1727204614.45895: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204614.45961: stderr chunk (state=3): >>><<< 40074 1727204614.45966: stdout chunk (state=3): >>><<< 40074 1727204614.45983: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
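Everything the controller learns about this task's outcome comes from that single JSON document on the module's stdout: `_low_level_execute_command()` returns rc/stdout/stderr, and the task result (`ok`, `changed`, or `failed`) is derived from the parsed fields. A minimal sketch of that triage step, simplified from what Ansible's TaskExecutor actually does:

```python
import json

# Stdout captured from the remote dnf module run, trimmed to the
# fields the status decision actually uses (see the log above).
module_stdout = '{"msg": "Nothing to do", "changed": false, "rc": 0, "results": []}'

result = json.loads(module_stdout)

# Rough mirror of the ok/changed/failed triage the controller performs.
if result.get("failed") or result.get("rc", 0) != 0:
    status = "failed"
elif result.get("changed"):
    status = "changed"
else:
    status = "ok"

print(f"{status}: {result.get('msg', '')}")  # -> ok: Nothing to do
```

This is why the task below reports `ok: [managed-node2]` with `MSG: Nothing to do`: `changed` is false and `rc` is 0, so nothing was installed and nothing failed.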
40074 1727204614.46037: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204614.46047: _low_level_execute_command(): starting 40074 1727204614.46050: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204612.5989158-40372-52327081087974/ > /dev/null 2>&1 && sleep 0' 40074 1727204614.46542: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204614.46546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.46548: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204614.46551: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204614.46555: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.46610: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204614.46614: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204614.46662: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204614.48637: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204614.48694: stderr chunk (state=3): >>><<< 40074 1727204614.48698: stdout chunk (state=3): >>><<< 40074 1727204614.48715: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
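With the cleanup command done, the per-task remote lifecycle visible in the log is complete: create a mode-0700 tmp dir, upload the AnsiballZ payload over SFTP, `chmod u+x`, execute with the remote Python, then `rm -f -r` the dir. A condensed, unprivileged re-enactment of that sequence (paths are local to `$PWD` so it runs anywhere; Ansible targets `~/.ansible/tmp` on the managed node, and the payload here is a stand-in script, not a real AnsiballZ zip):

```shell
set -e
base="$PWD/demo-ansible-tmp"
tmpdir="$base/ansible-tmp-demo"

# 1. create tmp dir (umask 77 => mode 0700, as in the log's `umask 77 && mkdir`)
( umask 77 && mkdir -p "$base" && mkdir "$tmpdir" )

# 2. "transfer" a module payload and mark it executable, as after the SFTP put
printf '#!/bin/sh\necho Nothing to do\n' > "$tmpdir/AnsiballZ_demo.sh"
chmod u+x "$tmpdir/AnsiballZ_demo.sh"

# 3. execute the payload (the log runs it with /usr/bin/python3.12 instead)
"$tmpdir/AnsiballZ_demo.sh"

# 4. cleanup, matching the log's final `rm -f -r ... && sleep 0`
rm -f -r "$tmpdir" > /dev/null 2>&1 && sleep 0
```

Note also the `debug1: auto-mux: Trying existing master` lines around every step: each of these commands rides one multiplexed SSH master connection (ControlMaster/ControlPersist-style options set by Ansible's ssh connection plugin), which is why the repeated round-trips complete in milliseconds rather than paying a full handshake each time.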
40074 1727204614.48725: handler run complete 40074 1727204614.48869: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204614.49024: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204614.49063: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204614.49092: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204614.49119: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204614.49186: variable '__install_status' from source: unknown 40074 1727204614.49206: Evaluated conditional (__install_status is success): True 40074 1727204614.49221: attempt loop complete, returning result 40074 1727204614.49224: _execute() done 40074 1727204614.49227: dumping result to json 40074 1727204614.49250: done dumping result, returning 40074 1727204614.49253: done running TaskExecutor() for managed-node2/TASK: Install iproute [12b410aa-8751-9fd7-2501-00000000016d] 40074 1727204614.49255: sending task result for task 12b410aa-8751-9fd7-2501-00000000016d 40074 1727204614.49359: done sending task result for task 12b410aa-8751-9fd7-2501-00000000016d 40074 1727204614.49362: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 40074 1727204614.49470: no more pending results, returning what we have 40074 1727204614.49473: results queue empty 40074 1727204614.49475: checking for any_errors_fatal 40074 1727204614.49480: done checking for any_errors_fatal 40074 1727204614.49480: checking for max_fail_percentage 40074 1727204614.49482: done checking for max_fail_percentage 40074 1727204614.49483: checking to see if all hosts have failed and the running result is not ok 40074 1727204614.49484: done checking to see if all 
hosts have failed 40074 1727204614.49485: getting the remaining hosts for this loop 40074 1727204614.49487: done getting the remaining hosts for this loop 40074 1727204614.49500: getting the next task for host managed-node2 40074 1727204614.49506: done getting next task for host managed-node2 40074 1727204614.49509: ^ task is: TASK: Create veth interface {{ interface }} 40074 1727204614.49512: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204614.49517: getting variables 40074 1727204614.49519: in VariableManager get_vars() 40074 1727204614.49563: Calling all_inventory to load vars for managed-node2 40074 1727204614.49566: Calling groups_inventory to load vars for managed-node2 40074 1727204614.49568: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204614.49580: Calling all_plugins_play to load vars for managed-node2 40074 1727204614.49583: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204614.49586: Calling groups_plugins_play to load vars for managed-node2 40074 1727204614.49839: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204614.50042: done with get_vars() 40074 1727204614.50052: done getting variables 40074 1727204614.50105: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204614.50212: variable 'interface' from source: set_fact TASK [Create veth interface ethtest0] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:03:34 -0400 (0:00:01.968) 0:00:08.264 ***** 40074 1727204614.50251: entering _queue_task() for managed-node2/command 40074 1727204614.50492: worker is 1 (out of 1 available) 40074 1727204614.50507: exiting _queue_task() for managed-node2/command 40074 1727204614.50522: done queuing things up, now waiting for results queue to drain 40074 1727204614.50523: waiting for pending results... 40074 1727204614.50688: running TaskExecutor() for managed-node2/TASK: Create veth interface ethtest0 40074 1727204614.50764: in run() - task 12b410aa-8751-9fd7-2501-00000000016e 40074 1727204614.50779: variable 'ansible_search_path' from source: unknown 40074 1727204614.50783: variable 'ansible_search_path' from source: unknown 40074 1727204614.51019: variable 'interface' from source: set_fact 40074 1727204614.51091: variable 'interface' from source: set_fact 40074 1727204614.51162: variable 'interface' from source: set_fact 40074 1727204614.51368: Loaded config def from plugin (lookup/items) 40074 1727204614.51373: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 40074 1727204614.51402: variable 'omit' from source: magic vars 40074 1727204614.51490: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204614.51500: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204614.51510: variable 'omit' from source: magic vars 40074 1727204614.51702: variable 
'ansible_distribution_major_version' from source: facts 40074 1727204614.51710: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204614.51890: variable 'type' from source: set_fact 40074 1727204614.51896: variable 'state' from source: include params 40074 1727204614.51902: variable 'interface' from source: set_fact 40074 1727204614.51907: variable 'current_interfaces' from source: set_fact 40074 1727204614.51914: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 40074 1727204614.51921: variable 'omit' from source: magic vars 40074 1727204614.51955: variable 'omit' from source: magic vars 40074 1727204614.51999: variable 'item' from source: unknown 40074 1727204614.52062: variable 'item' from source: unknown 40074 1727204614.52082: variable 'omit' from source: magic vars 40074 1727204614.52117: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204614.52145: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204614.52161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204614.52179: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204614.52198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204614.52224: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204614.52228: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204614.52235: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204614.52326: Set connection var ansible_pipelining to False 40074 1727204614.52334: Set connection var ansible_shell_executable to /bin/sh 40074 
1727204614.52338: Set connection var ansible_shell_type to sh 40074 1727204614.52340: Set connection var ansible_connection to ssh 40074 1727204614.52346: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204614.52352: Set connection var ansible_timeout to 10 40074 1727204614.52373: variable 'ansible_shell_executable' from source: unknown 40074 1727204614.52376: variable 'ansible_connection' from source: unknown 40074 1727204614.52379: variable 'ansible_module_compression' from source: unknown 40074 1727204614.52386: variable 'ansible_shell_type' from source: unknown 40074 1727204614.52389: variable 'ansible_shell_executable' from source: unknown 40074 1727204614.52391: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204614.52402: variable 'ansible_pipelining' from source: unknown 40074 1727204614.52404: variable 'ansible_timeout' from source: unknown 40074 1727204614.52407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204614.52595: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204614.52599: variable 'omit' from source: magic vars 40074 1727204614.52602: starting attempt loop 40074 1727204614.52604: running the handler 40074 1727204614.52608: _low_level_execute_command(): starting 40074 1727204614.52610: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204614.53141: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204614.53145: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.53148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204614.53150: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.53219: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204614.53222: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204614.53269: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204614.55103: stdout chunk (state=3): >>>/root <<< 40074 1727204614.55216: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204614.55278: stderr chunk (state=3): >>><<< 40074 1727204614.55281: stdout chunk (state=3): >>><<< 40074 1727204614.55307: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204614.55319: _low_level_execute_command(): starting 40074 1727204614.55326: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028 `" && echo ansible-tmp-1727204614.553067-40552-224444385253028="` echo /root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028 `" ) && sleep 0' 40074 1727204614.55835: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204614.55838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.55841: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204614.55843: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204614.55845: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.55903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204614.55909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204614.55950: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204614.58124: stdout chunk (state=3): >>>ansible-tmp-1727204614.553067-40552-224444385253028=/root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028 <<< 40074 1727204614.58341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204614.58345: stdout chunk (state=3): >>><<< 40074 1727204614.58347: stderr chunk (state=3): >>><<< 40074 1727204614.58370: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204614.553067-40552-224444385253028=/root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204614.58495: variable 'ansible_module_compression' from source: unknown 40074 1727204614.58499: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204614.58522: variable 'ansible_facts' from source: unknown 40074 1727204614.58628: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/AnsiballZ_command.py 40074 1727204614.58815: Sending initial data 40074 1727204614.58827: Sent initial data (155 bytes) 40074 1727204614.59435: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204614.59509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.59568: stderr chunk (state=3): >>>debug1: auto-mux: Trying 
existing master <<< 40074 1727204614.59588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204614.59629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204614.59703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204614.61494: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204614.61528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204614.61564: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpcnqdmtm1 /root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/AnsiballZ_command.py <<< 40074 1727204614.61571: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/AnsiballZ_command.py" <<< 40074 1727204614.61602: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpcnqdmtm1" to remote "/root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/AnsiballZ_command.py" <<< 40074 1727204614.62380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204614.62455: stderr chunk (state=3): >>><<< 40074 1727204614.62458: stdout chunk (state=3): >>><<< 40074 1727204614.62478: done transferring module to remote 40074 1727204614.62491: _low_level_execute_command(): starting 40074 1727204614.62497: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/ /root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/AnsiballZ_command.py && sleep 0' 40074 1727204614.62962: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204614.62965: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.62968: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204614.62975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.63027: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204614.63034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204614.63072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204614.65006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204614.65063: stderr chunk (state=3): >>><<< 40074 1727204614.65066: stdout chunk (state=3): >>><<< 40074 1727204614.65083: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204614.65086: _low_level_execute_command(): starting 40074 1727204614.65094: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/AnsiballZ_command.py && sleep 0' 40074 1727204614.65576: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204614.65579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204614.65582: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.65584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204614.65587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.65639: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204614.65643: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master 
version 4 <<< 40074 1727204614.65696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204614.84732: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:03:34.833617", "end": "2024-09-24 15:03:34.841023", "delta": "0:00:00.007406", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204614.87836: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204614.87901: stderr chunk (state=3): >>><<< 40074 1727204614.87905: stdout chunk (state=3): >>><<< 40074 1727204614.87921: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0"], "start": "2024-09-24 15:03:34.833617", "end": "2024-09-24 15:03:34.841023", "delta": "0:00:00.007406", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest0 type veth peer name peerethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204614.87961: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest0 type veth peer name peerethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204614.87972: _low_level_execute_command(): starting 40074 1727204614.87979: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204614.553067-40552-224444385253028/ > /dev/null 2>&1 && sleep 0' 40074 1727204614.88478: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 
24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204614.88482: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204614.88491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204614.88494: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204614.88496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.88557: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204614.88560: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204614.88686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204614.93017: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204614.93084: stderr chunk (state=3): >>><<< 40074 1727204614.93088: stdout chunk (state=3): >>><<< 40074 1727204614.93105: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204614.93114: handler run complete 40074 1727204614.93139: Evaluated conditional (False): False 40074 1727204614.93153: attempt loop complete, returning result 40074 1727204614.93177: variable 'item' from source: unknown 40074 1727204614.93249: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link add ethtest0 type veth peer name peerethtest0) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest0", "type", "veth", "peer", "name", "peerethtest0" ], "delta": "0:00:00.007406", "end": "2024-09-24 15:03:34.841023", "item": "ip link add ethtest0 type veth peer name peerethtest0", "rc": 0, "start": "2024-09-24 15:03:34.833617" } 40074 1727204614.93444: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204614.93447: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204614.93450: variable 'omit' from source: magic vars 40074 1727204614.93551: variable 'ansible_distribution_major_version' from source: facts 40074 1727204614.93557: Evaluated conditional (ansible_distribution_major_version != 
'6'): True 40074 1727204614.93719: variable 'type' from source: set_fact 40074 1727204614.93722: variable 'state' from source: include params 40074 1727204614.93728: variable 'interface' from source: set_fact 40074 1727204614.93735: variable 'current_interfaces' from source: set_fact 40074 1727204614.93741: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 40074 1727204614.93746: variable 'omit' from source: magic vars 40074 1727204614.93760: variable 'omit' from source: magic vars 40074 1727204614.93800: variable 'item' from source: unknown 40074 1727204614.93852: variable 'item' from source: unknown 40074 1727204614.93865: variable 'omit' from source: magic vars 40074 1727204614.93885: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204614.93897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204614.93910: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204614.93923: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204614.93926: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204614.93931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204614.94000: Set connection var ansible_pipelining to False 40074 1727204614.94014: Set connection var ansible_shell_executable to /bin/sh 40074 1727204614.94017: Set connection var ansible_shell_type to sh 40074 1727204614.94020: Set connection var ansible_connection to ssh 40074 1727204614.94022: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204614.94030: Set connection var ansible_timeout to 10 40074 1727204614.94051: variable 
'ansible_shell_executable' from source: unknown 40074 1727204614.94055: variable 'ansible_connection' from source: unknown 40074 1727204614.94057: variable 'ansible_module_compression' from source: unknown 40074 1727204614.94062: variable 'ansible_shell_type' from source: unknown 40074 1727204614.94064: variable 'ansible_shell_executable' from source: unknown 40074 1727204614.94069: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204614.94074: variable 'ansible_pipelining' from source: unknown 40074 1727204614.94077: variable 'ansible_timeout' from source: unknown 40074 1727204614.94083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204614.94169: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204614.94178: variable 'omit' from source: magic vars 40074 1727204614.94183: starting attempt loop 40074 1727204614.94186: running the handler 40074 1727204614.94196: _low_level_execute_command(): starting 40074 1727204614.94201: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204614.94687: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204614.94702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204614.94705: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204614.94708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204614.94756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<<
40074 1727204614.94759: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204614.94811: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204614.96598: stdout chunk (state=3): >>>/root <<<
40074 1727204614.96701: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204614.96759: stderr chunk (state=3): >>><<<
40074 1727204614.96764: stdout chunk (state=3): >>><<<
40074 1727204614.96779: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204614.96791: _low_level_execute_command(): starting
40074 1727204614.96797: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719 `" && echo ansible-tmp-1727204614.9678028-40552-254966347079719="` echo /root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719 `" ) && sleep 0'
40074 1727204614.97274: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204614.97278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204614.97280: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<<
40074 1727204614.97283: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<<
40074 1727204614.97285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204614.97342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204614.97348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204614.97391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204614.99465: stdout chunk (state=3): >>>ansible-tmp-1727204614.9678028-40552-254966347079719=/root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719 <<<
40074 1727204614.99586: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204614.99641: stderr chunk (state=3): >>><<<
40074 1727204614.99644: stdout chunk (state=3): >>><<<
40074 1727204614.99661: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204614.9678028-40552-254966347079719=/root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit
status from master 0
40074 1727204614.99686: variable 'ansible_module_compression' from source: unknown
40074 1727204614.99719: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
40074 1727204614.99736: variable 'ansible_facts' from source: unknown
40074 1727204614.99787: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/AnsiballZ_command.py
40074 1727204614.99888: Sending initial data
40074 1727204614.99895: Sent initial data (156 bytes)
40074 1727204615.00385: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204615.00388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.00395: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204615.00397: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<<
40074 1727204615.00400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.00455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204615.00459: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204615.00500: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204615.02229: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
40074 1727204615.02265: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "."
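The staging flow recorded in this log is: create a private per-task directory with `( umask 77 && mkdir -p ... )`, upload `AnsiballZ_command.py` over the multiplexed SFTP channel, then `chmod u+x` both the directory and the payload. A minimal local sketch of the same steps in Python, assuming illustrative paths and a simplified directory-name format (not Ansible's actual implementation):

```python
import os
import stat
import tempfile
import time


def stage_payload(base, payload_text):
    # Create the per-task directory under umask 77, mirroring the logged
    # `( umask 77 && mkdir -p ... && mkdir ... )` command; the name format
    # here is illustrative only.
    name = "ansible-tmp-%s-%s" % (time.time(), os.getpid())
    old_umask = os.umask(0o077)
    try:
        os.makedirs(base, exist_ok=True)   # mkdir -p for the parent
        taskdir = os.path.join(base, name)
        os.mkdir(taskdir)                  # the unique per-task directory
    finally:
        os.umask(old_umask)

    # "Upload" the module payload (written locally here; SFTP in the log).
    script = os.path.join(taskdir, "AnsiballZ_command.py")
    with open(script, "w") as f:
        f.write(payload_text)

    # chmod u+x on both the directory and the payload, as in the logged
    # `chmod u+x <tmpdir>/ <tmpdir>/AnsiballZ_command.py` step.
    for p in (taskdir, script):
        os.chmod(p, os.stat(p).st_mode | stat.S_IXUSR)
    return taskdir, script


base = os.path.join(tempfile.gettempdir(), "demo-ansible-staging")
taskdir, script = stage_payload(base, "#!/usr/bin/python3\nprint('module ran')\n")
print(taskdir, script)
```

The `umask 77` is what makes the staging directory readable only by the connecting user, which matters because the zipped module payload can embed task arguments.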
<<<
40074 1727204615.02308: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpjod9cmbg /root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/AnsiballZ_command.py <<<
40074 1727204615.02317: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/AnsiballZ_command.py" <<<
40074 1727204615.02349: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<<
40074 1727204615.02352: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpjod9cmbg" to remote "/root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/AnsiballZ_command.py" <<<
40074 1727204615.03142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204615.03219: stderr chunk (state=3): >>><<<
40074 1727204615.03223: stdout chunk (state=3): >>><<<
40074 1727204615.03244: done transferring module to remote
40074 1727204615.03253: _low_level_execute_command(): starting
40074 1727204615.03259: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/ /root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/AnsiballZ_command.py && sleep 0'
40074 1727204615.03758: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204615.03762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
40074 1727204615.03764: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.03767: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204615.03769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.03832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204615.03835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204615.03871: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204615.05833: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204615.05888: stderr chunk (state=3): >>><<<
40074 1727204615.05895: stdout chunk (state=3): >>><<<
40074 1727204615.05912: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204615.05917: _low_level_execute_command(): starting
40074 1727204615.05924: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/AnsiballZ_command.py && sleep 0'
40074 1727204615.06416: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204615.06419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
40074 1727204615.06422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
40074 1727204615.06424: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
40074 1727204615.06426: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.06481: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074
1727204615.06484: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204615.06541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204615.24692: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:03:35.241807", "end": "2024-09-24 15:03:35.245754", "delta": "0:00:00.003947", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<<
40074 1727204615.26467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<<
40074 1727204615.26530: stderr chunk (state=3): >>><<<
40074 1727204615.26536: stdout chunk (state=3): >>><<<
40074 1727204615.26552: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest0", "up"], "start": "2024-09-24 15:03:35.241807", "end": "2024-09-24 15:03:35.245754", "delta": "0:00:00.003947", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed.
40074 1727204615.26587: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
40074 1727204615.26598: _low_level_execute_command(): starting
40074 1727204615.26605: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204614.9678028-40552-254966347079719/ > /dev/null 2>&1 && sleep 0'
40074 1727204615.27099: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204615.27103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
40074 1727204615.27105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
40074 1727204615.27108: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
40074 1727204615.27110: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.27168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204615.27171: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204615.27217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204615.29175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204615.29232: stderr chunk (state=3): >>><<<
40074 1727204615.29237: stdout chunk (state=3): >>><<<
40074 1727204615.29254: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204615.29260: handler run complete
40074 1727204615.29280: Evaluated conditional (False): False
40074 1727204615.29296: attempt loop complete, returning result
40074 1727204615.29315: variable 'item' from source: unknown
40074 1727204615.29384: variable 'item' from source: unknown
ok: [managed-node2] => (item=ip link set peerethtest0 up) => {
    "ansible_loop_var": "item",
    "changed": false,
    "cmd": [
        "ip",
        "link",
        "set",
        "peerethtest0",
        "up"
    ],
    "delta": "0:00:00.003947",
    "end": "2024-09-24 15:03:35.245754",
    "item": "ip link set peerethtest0 up",
    "rc": 0,
    "start": "2024-09-24 15:03:35.241807"
}
40074 1727204615.29611: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204615.29615: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204615.29618: variable 'omit' from source: magic vars
40074 1727204615.29697: variable 'ansible_distribution_major_version' from source: facts
40074 1727204615.29706: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204615.29874: variable 'type' from source: set_fact
40074 1727204615.29877: variable 'state' from source: include params
40074 1727204615.29883: variable 'interface' from source: set_fact
40074 1727204615.29888: variable 'current_interfaces' from source: set_fact
40074 1727204615.29899: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True
40074 1727204615.29904: variable 'omit' from source: magic vars
40074 1727204615.29919: variable 'omit' from source: magic vars
40074 1727204615.29960: variable 'item' from source: unknown
40074 1727204615.30017: variable 'item' from source: unknown
40074 1727204615.30032: variable 'omit' from source: magic vars
40074 1727204615.30055: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204615.30063: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204615.30070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204615.30083: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204615.30086: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204615.30092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204615.30160: Set connection var ansible_pipelining to False
40074 1727204615.30169: Set connection var ansible_shell_executable to /bin/sh
40074 1727204615.30172: Set connection var ansible_shell_type to sh
40074 1727204615.30175: Set connection var ansible_connection to ssh
40074 1727204615.30182: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204615.30188: Set connection var ansible_timeout to 10
40074 1727204615.30209: variable 'ansible_shell_executable' from source: unknown
40074 1727204615.30212: variable 'ansible_connection' from source: unknown
40074 1727204615.30215: variable 'ansible_module_compression' from source: unknown
40074 1727204615.30219: variable 'ansible_shell_type' from source: unknown
40074 1727204615.30222: variable 'ansible_shell_executable' from source: unknown
40074 1727204615.30227: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204615.30234: variable 'ansible_pipelining' from source: unknown
40074 1727204615.30237: variable 'ansible_timeout' from source: unknown
40074 1727204615.30239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204615.30325: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204615.30335: variable 'omit' from source: magic vars
40074 1727204615.30338: starting attempt loop
40074 1727204615.30341: running the handler
40074 1727204615.30348: _low_level_execute_command(): starting
40074 1727204615.30353: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
40074 1727204615.30845: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204615.30849: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.30851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading
configuration data /etc/ssh/ssh_config <<<
40074 1727204615.30853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.30907: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204615.30912: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204615.30961: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204615.32732: stdout chunk (state=3): >>>/root <<<
40074 1727204615.32842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204615.32903: stderr chunk (state=3): >>><<<
40074 1727204615.32906: stdout chunk (state=3): >>><<<
40074 1727204615.32921: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204615.32930: _low_level_execute_command(): starting
40074 1727204615.32938: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470 `" && echo ansible-tmp-1727204615.3292155-40552-164950524410470="` echo /root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470 `" ) && sleep 0'
40074 1727204615.33431: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204615.33435: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
40074 1727204615.33439: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<<
40074 1727204615.33441: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
40074 1727204615.33444: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.33495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204615.33498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204615.33549: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204615.35626: stdout chunk (state=3): >>>ansible-tmp-1727204615.3292155-40552-164950524410470=/root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470 <<<
40074 1727204615.35744: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204615.35799: stderr chunk (state=3): >>><<<
40074 1727204615.35803: stdout chunk (state=3): >>><<<
40074 1727204615.35822: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204615.3292155-40552-164950524410470=/root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204615.35845: variable 'ansible_module_compression' from source: unknown
40074 1727204615.35875: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED
40074 1727204615.35896: variable 'ansible_facts' from source: unknown
40074 1727204615.35950: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/AnsiballZ_command.py
40074 1727204615.36056: Sending initial data
40074 1727204615.36060: Sent initial data (156 bytes)
40074 1727204615.36539: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204615.36542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
40074 1727204615.36545: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.36550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204615.36553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.36603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204615.36606: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204615.36653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
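Each `AnsiballZ_command.py` run prints a single JSON object on stdout, like the `ip link set peerethtest0 up` result captured earlier in this log. A minimal sketch of consuming such a payload (keys abridged from the logged result; the failure rule here is the usual non-zero `rc` / `failed` convention, not Ansible's exact internal logic):

```python
import json

# Abridged from the module stdout recorded in the log above.
raw = """{"changed": true, "stdout": "", "stderr": "", "rc": 0,
 "cmd": ["ip", "link", "set", "peerethtest0", "up"],
 "start": "2024-09-24 15:03:35.241807", "end": "2024-09-24 15:03:35.245754",
 "delta": "0:00:00.003947", "msg": ""}"""

result = json.loads(raw)

# Treat the command as failed on a non-zero rc or an explicit "failed" flag.
failed = result.get("rc", 0) != 0 or result.get("failed", False)
print(" ".join(result["cmd"]), "->", "failed" if failed else "ok")
```

Because stdout carries the entire result, any stray output from the remote shell (motd, profile scripts) would corrupt the JSON, which is why the module is invoked through a bare `/bin/sh -c '...'` rather than a login shell.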
40074 1727204615.38303: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<<
40074 1727204615.38307: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
40074 1727204615.38344: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
40074 1727204615.38384: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpvi5_wz7t /root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/AnsiballZ_command.py <<<
40074 1727204615.38394: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/AnsiballZ_command.py" <<<
40074 1727204615.38422: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpvi5_wz7t" to remote "/root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/AnsiballZ_command.py" <<<
40074 1727204615.39192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204615.39267: stderr chunk (state=3): >>><<<
40074 1727204615.39271: stdout chunk (state=3): >>><<<
40074 1727204615.39287: done transferring module to remote
40074 1727204615.39297: _low_level_execute_command(): starting
40074 1727204615.39304: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/ /root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/AnsiballZ_command.py && sleep 0'
40074 1727204615.39795: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204615.39798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
40074 1727204615.39801: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<<
40074 1727204615.39803: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
40074 1727204615.39806: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204615.39859: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204615.39863: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204615.39908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204615.41860: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204615.41916: stderr chunk (state=3): >>><<<
40074 1727204615.41919: stdout chunk (state=3): >>><<<
40074 1727204615.41936: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204615.41940: _low_level_execute_command(): starting
40074 1727204615.41947: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/AnsiballZ_command.py && sleep 0'
40074 1727204615.42412: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204615.42416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204615.42419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204615.42421: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204615.42423: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204615.42475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204615.42482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204615.42528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204615.61038: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:03:35.604276", "end": "2024-09-24 15:03:35.608470", "delta": "0:00:00.004194", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204615.63057: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204615.63061: stdout chunk (state=3): >>><<< 40074 1727204615.63064: stderr chunk (state=3): >>><<< 40074 1727204615.63067: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest0", "up"], "start": "2024-09-24 15:03:35.604276", "end": "2024-09-24 15:03:35.608470", "delta": "0:00:00.004194", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest0 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204615.63069: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest0 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204615.63071: _low_level_execute_command(): starting 40074 1727204615.63074: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204615.3292155-40552-164950524410470/ > /dev/null 2>&1 && sleep 0' 40074 1727204615.63661: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204615.63678: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204615.63697: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204615.63825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204615.63855: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204615.63939: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204615.66023: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204615.66041: stdout chunk (state=3): >>><<< 40074 1727204615.66060: stderr chunk (state=3): >>><<< 40074 1727204615.66195: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204615.66199: handler run complete 40074 1727204615.66202: Evaluated conditional (False): 
False 40074 1727204615.66204: attempt loop complete, returning result 40074 1727204615.66206: variable 'item' from source: unknown 40074 1727204615.66269: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set ethtest0 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest0", "up" ], "delta": "0:00:00.004194", "end": "2024-09-24 15:03:35.608470", "item": "ip link set ethtest0 up", "rc": 0, "start": "2024-09-24 15:03:35.604276" } 40074 1727204615.66496: dumping result to json 40074 1727204615.66499: done dumping result, returning 40074 1727204615.66502: done running TaskExecutor() for managed-node2/TASK: Create veth interface ethtest0 [12b410aa-8751-9fd7-2501-00000000016e] 40074 1727204615.66504: sending task result for task 12b410aa-8751-9fd7-2501-00000000016e 40074 1727204615.66731: done sending task result for task 12b410aa-8751-9fd7-2501-00000000016e 40074 1727204615.66734: WORKER PROCESS EXITING 40074 1727204615.66840: no more pending results, returning what we have 40074 1727204615.66844: results queue empty 40074 1727204615.66845: checking for any_errors_fatal 40074 1727204615.66850: done checking for any_errors_fatal 40074 1727204615.66851: checking for max_fail_percentage 40074 1727204615.66853: done checking for max_fail_percentage 40074 1727204615.66854: checking to see if all hosts have failed and the running result is not ok 40074 1727204615.66856: done checking to see if all hosts have failed 40074 1727204615.66862: getting the remaining hosts for this loop 40074 1727204615.66864: done getting the remaining hosts for this loop 40074 1727204615.66868: getting the next task for host managed-node2 40074 1727204615.66874: done getting next task for host managed-node2 40074 1727204615.66878: ^ task is: TASK: Set up veth as managed by NetworkManager 40074 1727204615.66882: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204615.66887: getting variables 40074 1727204615.67009: in VariableManager get_vars() 40074 1727204615.67048: Calling all_inventory to load vars for managed-node2 40074 1727204615.67052: Calling groups_inventory to load vars for managed-node2 40074 1727204615.67055: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204615.67068: Calling all_plugins_play to load vars for managed-node2 40074 1727204615.67071: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204615.67075: Calling groups_plugins_play to load vars for managed-node2 40074 1727204615.67518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204615.67893: done with get_vars() 40074 1727204615.67907: done getting variables 40074 1727204615.67982: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:03:35 -0400 (0:00:01.177) 0:00:09.441 ***** 40074 1727204615.68016: entering _queue_task() for managed-node2/command 40074 1727204615.68336: worker is 1 (out 
of 1 available) 40074 1727204615.68350: exiting _queue_task() for managed-node2/command 40074 1727204615.68363: done queuing things up, now waiting for results queue to drain 40074 1727204615.68365: waiting for pending results... 40074 1727204615.68721: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 40074 1727204615.68774: in run() - task 12b410aa-8751-9fd7-2501-00000000016f 40074 1727204615.68798: variable 'ansible_search_path' from source: unknown 40074 1727204615.68806: variable 'ansible_search_path' from source: unknown 40074 1727204615.68858: calling self._execute() 40074 1727204615.68968: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204615.69036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204615.69040: variable 'omit' from source: magic vars 40074 1727204615.69858: variable 'ansible_distribution_major_version' from source: facts 40074 1727204615.69877: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204615.70096: variable 'type' from source: set_fact 40074 1727204615.70109: variable 'state' from source: include params 40074 1727204615.70127: Evaluated conditional (type == 'veth' and state == 'present'): True 40074 1727204615.70140: variable 'omit' from source: magic vars 40074 1727204615.70193: variable 'omit' from source: magic vars 40074 1727204615.70326: variable 'interface' from source: set_fact 40074 1727204615.70394: variable 'omit' from source: magic vars 40074 1727204615.70414: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204615.70467: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204615.70497: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204615.70524: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204615.70544: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204615.70673: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204615.70676: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204615.70678: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204615.70753: Set connection var ansible_pipelining to False 40074 1727204615.70768: Set connection var ansible_shell_executable to /bin/sh 40074 1727204615.70783: Set connection var ansible_shell_type to sh 40074 1727204615.70793: Set connection var ansible_connection to ssh 40074 1727204615.70807: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204615.70819: Set connection var ansible_timeout to 10 40074 1727204615.70855: variable 'ansible_shell_executable' from source: unknown 40074 1727204615.70864: variable 'ansible_connection' from source: unknown 40074 1727204615.70872: variable 'ansible_module_compression' from source: unknown 40074 1727204615.70880: variable 'ansible_shell_type' from source: unknown 40074 1727204615.70900: variable 'ansible_shell_executable' from source: unknown 40074 1727204615.70909: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204615.70919: variable 'ansible_pipelining' from source: unknown 40074 1727204615.70927: variable 'ansible_timeout' from source: unknown 40074 1727204615.70936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204615.71195: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204615.71199: variable 'omit' from source: magic vars 40074 1727204615.71202: starting attempt loop 40074 1727204615.71204: running the handler 40074 1727204615.71206: _low_level_execute_command(): starting 40074 1727204615.71208: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204615.71983: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204615.72000: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204615.72108: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204615.72131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204615.72149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204615.72223: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204615.74026: stdout chunk (state=3): >>>/root <<< 40074 1727204615.74214: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204615.74230: stdout chunk (state=3): >>><<< 40074 1727204615.74252: stderr chunk (state=3): >>><<< 40074 1727204615.74283: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204615.74307: _low_level_execute_command(): starting 40074 1727204615.74395: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985 `" && echo ansible-tmp-1727204615.7429247-40634-128075213514985="` echo /root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985 `" ) && sleep 0' 40074 1727204615.75002: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204615.75016: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204615.75075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204615.75157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204615.75194: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204615.75210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204615.75279: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204615.77373: stdout chunk (state=3): >>>ansible-tmp-1727204615.7429247-40634-128075213514985=/root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985 <<< 40074 1727204615.77577: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204615.77602: stderr chunk (state=3): >>><<< 40074 1727204615.77620: stdout chunk (state=3): >>><<< 40074 1727204615.77795: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204615.7429247-40634-128075213514985=/root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204615.77799: variable 'ansible_module_compression' from source: unknown 40074 1727204615.77801: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204615.77804: variable 'ansible_facts' from source: unknown 40074 1727204615.77911: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/AnsiballZ_command.py 40074 1727204615.78172: Sending initial data 40074 1727204615.78175: Sent initial data (156 bytes) 40074 1727204615.78834: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204615.78922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204615.78951: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204615.79041: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204615.80749: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204615.80823: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204615.80988: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpaa_ero57 /root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/AnsiballZ_command.py <<< 40074 1727204615.81010: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpaa_ero57" to remote "/root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/AnsiballZ_command.py" <<< 40074 1727204615.82229: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204615.82352: stderr chunk (state=3): >>><<< 40074 1727204615.82365: stdout chunk (state=3): >>><<< 40074 1727204615.82404: done transferring module to remote 40074 1727204615.82421: _low_level_execute_command(): starting 40074 1727204615.82433: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/ /root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/AnsiballZ_command.py && sleep 0' 40074 1727204615.83129: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204615.83156: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204615.83177: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204615.83266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204615.83316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204615.83337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204615.83370: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204615.83433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204615.85510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204615.85553: stderr chunk (state=3): >>><<< 40074 1727204615.85557: stdout chunk (state=3): >>><<< 40074 1727204615.85666: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204615.85674: _low_level_execute_command(): starting 40074 1727204615.85678: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/AnsiballZ_command.py && sleep 0' 40074 1727204615.86243: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204615.86261: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204615.86277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204615.86304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204615.86323: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204615.86337: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204615.86443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204615.86465: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204615.86552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.06407: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:03:36.040271", "end": "2024-09-24 15:03:36.061946", "delta": "0:00:00.021675", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204616.08543: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204616.08581: stdout chunk (state=3): >>><<< 40074 1727204616.08585: stderr chunk (state=3): >>><<< 40074 1727204616.08609: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest0", "managed", "true"], "start": "2024-09-24 15:03:36.040271", "end": "2024-09-24 15:03:36.061946", "delta": "0:00:00.021675", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest0 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204616.08669: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest0 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204616.08768: _low_level_execute_command(): starting 40074 1727204616.08772: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985/ > /dev/null 2>&1 && sleep 0' 40074 1727204616.09408: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 
1727204616.09426: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204616.09452: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.09576: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204616.09600: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204616.09617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.09703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.12516: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.12553: stderr chunk (state=3): >>><<< 40074 1727204616.12714: stdout chunk (state=3): >>><<< 40074 1727204616.13042: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204616.13050: handler run complete 40074 1727204616.13054: Evaluated conditional (False): False 40074 1727204616.13056: attempt loop complete, returning result 40074 1727204616.13058: _execute() done 40074 1727204616.13060: dumping result to json 40074 1727204616.13062: done dumping result, returning 40074 1727204616.13066: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [12b410aa-8751-9fd7-2501-00000000016f] 40074 1727204616.13076: sending task result for task 12b410aa-8751-9fd7-2501-00000000016f ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "ethtest0", "managed", "true" ], "delta": "0:00:00.021675", "end": "2024-09-24 15:03:36.061946", "rc": 0, "start": "2024-09-24 15:03:36.040271" } 40074 1727204616.13328: no more pending results, returning what we have 40074 1727204616.13332: results queue empty 40074 1727204616.13334: checking for any_errors_fatal 40074 1727204616.13350: done checking for any_errors_fatal 40074 1727204616.13351: checking for max_fail_percentage 40074 1727204616.13353: done checking for max_fail_percentage 40074 1727204616.13355: checking to see if all hosts have failed and the 
running result is not ok 40074 1727204616.13356: done checking to see if all hosts have failed 40074 1727204616.13357: getting the remaining hosts for this loop 40074 1727204616.13359: done getting the remaining hosts for this loop 40074 1727204616.13365: getting the next task for host managed-node2 40074 1727204616.13373: done getting next task for host managed-node2 40074 1727204616.13377: ^ task is: TASK: Delete veth interface {{ interface }} 40074 1727204616.13380: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.13387: getting variables 40074 1727204616.13521: in VariableManager get_vars() 40074 1727204616.13574: Calling all_inventory to load vars for managed-node2 40074 1727204616.13578: Calling groups_inventory to load vars for managed-node2 40074 1727204616.13581: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.13746: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.13750: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.13755: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.14433: done sending task result for task 12b410aa-8751-9fd7-2501-00000000016f 40074 1727204616.14437: WORKER PROCESS EXITING 40074 1727204616.14546: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.14924: done with get_vars() 40074 1727204616.14948: done getting variables 40074 1727204616.15026: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204616.15193: variable 'interface' from source: set_fact TASK [Delete veth interface ethtest0] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.472) 0:00:09.914 ***** 40074 1727204616.15228: entering _queue_task() for managed-node2/command 40074 1727204616.16073: worker is 1 (out of 1 available) 40074 1727204616.16086: exiting _queue_task() for managed-node2/command 40074 1727204616.16103: done queuing things up, now waiting for results queue to drain 40074 1727204616.16105: waiting for pending results... 
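The `_low_level_execute_command()` entries above record the lifecycle Ansible uses for every command task on this run: stage the `AnsiballZ_command.py` payload over the multiplexed SSH connection, mark it executable, run it with the remote interpreter, then remove the temporary directory. A minimal sketch of the three `/bin/sh -c` command lines, reconstructed from the log (the temp path is the one this run happened to use, shown for illustration only):

```python
# Sketch of the per-task remote command sequence visible in the
# _low_level_execute_command() log entries above: chmod, execute,
# then clean up the temp dir. Paths mirror the ones in this log.
TMP = "/root/.ansible/tmp/ansible-tmp-1727204615.7429247-40634-128075213514985"

def remote_commands(tmp_dir: str, module: str = "AnsiballZ_command.py",
                    python: str = "/usr/bin/python3.12") -> list[str]:
    """Return the chmod / execute / cleanup commands in the order run."""
    return [
        f"chmod u+x {tmp_dir}/ {tmp_dir}/{module} && sleep 0",
        f"{python} {tmp_dir}/{module} && sleep 0",
        f"rm -f -r {tmp_dir}/ > /dev/null 2>&1 && sleep 0",
    ]

cmds = remote_commands(TMP)
```

Each of these is wrapped in `/bin/sh -c '...'` and sent over the existing ControlMaster session, which is why every step in the log begins with the same `auto-mux: Trying existing master` SSH negotiation.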
40074 1727204616.16378: running TaskExecutor() for managed-node2/TASK: Delete veth interface ethtest0 40074 1727204616.16523: in run() - task 12b410aa-8751-9fd7-2501-000000000170 40074 1727204616.16546: variable 'ansible_search_path' from source: unknown 40074 1727204616.16554: variable 'ansible_search_path' from source: unknown 40074 1727204616.16615: calling self._execute() 40074 1727204616.16796: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.16800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.16804: variable 'omit' from source: magic vars 40074 1727204616.17185: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.17202: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.17388: variable 'type' from source: set_fact 40074 1727204616.17398: variable 'state' from source: include params 40074 1727204616.17405: variable 'interface' from source: set_fact 40074 1727204616.17412: variable 'current_interfaces' from source: set_fact 40074 1727204616.17421: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 40074 1727204616.17427: when evaluation is False, skipping this task 40074 1727204616.17432: _execute() done 40074 1727204616.17438: dumping result to json 40074 1727204616.17443: done dumping result, returning 40074 1727204616.17450: done running TaskExecutor() for managed-node2/TASK: Delete veth interface ethtest0 [12b410aa-8751-9fd7-2501-000000000170] 40074 1727204616.17456: sending task result for task 12b410aa-8751-9fd7-2501-000000000170 40074 1727204616.17554: done sending task result for task 12b410aa-8751-9fd7-2501-000000000170 40074 1727204616.17557: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 40074 1727204616.17635: no more pending results, returning what we have 40074 1727204616.17638: results queue empty 40074 1727204616.17639: checking for any_errors_fatal 40074 1727204616.17647: done checking for any_errors_fatal 40074 1727204616.17648: checking for max_fail_percentage 40074 1727204616.17650: done checking for max_fail_percentage 40074 1727204616.17651: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.17652: done checking to see if all hosts have failed 40074 1727204616.17653: getting the remaining hosts for this loop 40074 1727204616.17654: done getting the remaining hosts for this loop 40074 1727204616.17658: getting the next task for host managed-node2 40074 1727204616.17664: done getting next task for host managed-node2 40074 1727204616.17667: ^ task is: TASK: Create dummy interface {{ interface }} 40074 1727204616.17670: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.17674: getting variables 40074 1727204616.17675: in VariableManager get_vars() 40074 1727204616.17721: Calling all_inventory to load vars for managed-node2 40074 1727204616.17724: Calling groups_inventory to load vars for managed-node2 40074 1727204616.17726: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.17738: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.17741: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.17745: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.17922: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.18130: done with get_vars() 40074 1727204616.18140: done getting variables 40074 1727204616.18188: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204616.18285: variable 'interface' from source: set_fact TASK [Create dummy interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.030) 0:00:09.944 ***** 40074 1727204616.18311: entering _queue_task() for managed-node2/command 40074 1727204616.18531: worker is 1 (out of 1 available) 40074 1727204616.18546: exiting _queue_task() for managed-node2/command 40074 1727204616.18567: done queuing things up, now waiting for results queue to drain 40074 1727204616.18569: waiting for pending results... 
40074 1727204616.18904: running TaskExecutor() for managed-node2/TASK: Create dummy interface ethtest0 40074 1727204616.18909: in run() - task 12b410aa-8751-9fd7-2501-000000000171 40074 1727204616.18913: variable 'ansible_search_path' from source: unknown 40074 1727204616.18922: variable 'ansible_search_path' from source: unknown 40074 1727204616.18966: calling self._execute() 40074 1727204616.19068: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.19081: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.19107: variable 'omit' from source: magic vars 40074 1727204616.19521: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.19543: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.19818: variable 'type' from source: set_fact 40074 1727204616.19822: variable 'state' from source: include params 40074 1727204616.19826: variable 'interface' from source: set_fact 40074 1727204616.19860: variable 'current_interfaces' from source: set_fact 40074 1727204616.19873: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 40074 1727204616.19877: when evaluation is False, skipping this task 40074 1727204616.19879: _execute() done 40074 1727204616.19882: dumping result to json 40074 1727204616.19884: done dumping result, returning 40074 1727204616.19887: done running TaskExecutor() for managed-node2/TASK: Create dummy interface ethtest0 [12b410aa-8751-9fd7-2501-000000000171] 40074 1727204616.19892: sending task result for task 12b410aa-8751-9fd7-2501-000000000171 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 40074 1727204616.20037: no more pending results, returning what we have 40074 1727204616.20041: results queue empty 40074 
1727204616.20042: checking for any_errors_fatal 40074 1727204616.20048: done checking for any_errors_fatal 40074 1727204616.20049: checking for max_fail_percentage 40074 1727204616.20050: done checking for max_fail_percentage 40074 1727204616.20051: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.20053: done checking to see if all hosts have failed 40074 1727204616.20053: getting the remaining hosts for this loop 40074 1727204616.20055: done getting the remaining hosts for this loop 40074 1727204616.20059: getting the next task for host managed-node2 40074 1727204616.20064: done getting next task for host managed-node2 40074 1727204616.20067: ^ task is: TASK: Delete dummy interface {{ interface }} 40074 1727204616.20070: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.20079: getting variables 40074 1727204616.20081: in VariableManager get_vars() 40074 1727204616.20142: Calling all_inventory to load vars for managed-node2 40074 1727204616.20145: Calling groups_inventory to load vars for managed-node2 40074 1727204616.20148: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.20160: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.20163: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.20166: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.20530: done sending task result for task 12b410aa-8751-9fd7-2501-000000000171 40074 1727204616.20534: WORKER PROCESS EXITING 40074 1727204616.20561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.20937: done with get_vars() 40074 1727204616.20955: done getting variables 40074 1727204616.21032: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204616.21179: variable 'interface' from source: set_fact TASK [Delete dummy interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.028) 0:00:09.973 ***** 40074 1727204616.21214: entering _queue_task() for managed-node2/command 40074 1727204616.21491: worker is 1 (out of 1 available) 40074 1727204616.21507: exiting _queue_task() for managed-node2/command 40074 1727204616.21519: done queuing things up, now waiting for results queue to drain 40074 1727204616.21521: waiting for pending results... 
40074 1727204616.21692: running TaskExecutor() for managed-node2/TASK: Delete dummy interface ethtest0 40074 1727204616.21768: in run() - task 12b410aa-8751-9fd7-2501-000000000172 40074 1727204616.21779: variable 'ansible_search_path' from source: unknown 40074 1727204616.21784: variable 'ansible_search_path' from source: unknown 40074 1727204616.21818: calling self._execute() 40074 1727204616.21897: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.21904: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.21914: variable 'omit' from source: magic vars 40074 1727204616.22218: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.22229: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.22402: variable 'type' from source: set_fact 40074 1727204616.22406: variable 'state' from source: include params 40074 1727204616.22410: variable 'interface' from source: set_fact 40074 1727204616.22414: variable 'current_interfaces' from source: set_fact 40074 1727204616.22426: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 40074 1727204616.22429: when evaluation is False, skipping this task 40074 1727204616.22435: _execute() done 40074 1727204616.22438: dumping result to json 40074 1727204616.22440: done dumping result, returning 40074 1727204616.22447: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface ethtest0 [12b410aa-8751-9fd7-2501-000000000172] 40074 1727204616.22453: sending task result for task 12b410aa-8751-9fd7-2501-000000000172 40074 1727204616.22547: done sending task result for task 12b410aa-8751-9fd7-2501-000000000172 40074 1727204616.22550: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was 
False" } 40074 1727204616.22605: no more pending results, returning what we have 40074 1727204616.22609: results queue empty 40074 1727204616.22610: checking for any_errors_fatal 40074 1727204616.22614: done checking for any_errors_fatal 40074 1727204616.22615: checking for max_fail_percentage 40074 1727204616.22616: done checking for max_fail_percentage 40074 1727204616.22618: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.22619: done checking to see if all hosts have failed 40074 1727204616.22620: getting the remaining hosts for this loop 40074 1727204616.22621: done getting the remaining hosts for this loop 40074 1727204616.22625: getting the next task for host managed-node2 40074 1727204616.22630: done getting next task for host managed-node2 40074 1727204616.22637: ^ task is: TASK: Create tap interface {{ interface }} 40074 1727204616.22640: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.22643: getting variables 40074 1727204616.22645: in VariableManager get_vars() 40074 1727204616.22682: Calling all_inventory to load vars for managed-node2 40074 1727204616.22685: Calling groups_inventory to load vars for managed-node2 40074 1727204616.22687: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.22699: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.22703: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.22707: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.22872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.23100: done with get_vars() 40074 1727204616.23108: done getting variables 40074 1727204616.23160: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204616.23269: variable 'interface' from source: set_fact TASK [Create tap interface ethtest0] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.020) 0:00:09.994 ***** 40074 1727204616.23301: entering _queue_task() for managed-node2/command 40074 1727204616.23557: worker is 1 (out of 1 available) 40074 1727204616.23572: exiting _queue_task() for managed-node2/command 40074 1727204616.23588: done queuing things up, now waiting for results queue to drain 40074 1727204616.23693: waiting for pending results... 
40074 1727204616.24007: running TaskExecutor() for managed-node2/TASK: Create tap interface ethtest0 40074 1727204616.24012: in run() - task 12b410aa-8751-9fd7-2501-000000000173 40074 1727204616.24021: variable 'ansible_search_path' from source: unknown 40074 1727204616.24033: variable 'ansible_search_path' from source: unknown 40074 1727204616.24080: calling self._execute() 40074 1727204616.24197: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.24212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.24235: variable 'omit' from source: magic vars 40074 1727204616.24592: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.24601: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.24778: variable 'type' from source: set_fact 40074 1727204616.24782: variable 'state' from source: include params 40074 1727204616.24798: variable 'interface' from source: set_fact 40074 1727204616.24801: variable 'current_interfaces' from source: set_fact 40074 1727204616.24805: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 40074 1727204616.24808: when evaluation is False, skipping this task 40074 1727204616.24811: _execute() done 40074 1727204616.24813: dumping result to json 40074 1727204616.24818: done dumping result, returning 40074 1727204616.24824: done running TaskExecutor() for managed-node2/TASK: Create tap interface ethtest0 [12b410aa-8751-9fd7-2501-000000000173] 40074 1727204616.24830: sending task result for task 12b410aa-8751-9fd7-2501-000000000173 skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 40074 1727204616.24971: no more pending results, returning what we have 40074 1727204616.24974: results queue empty 40074 
1727204616.24976: checking for any_errors_fatal 40074 1727204616.24981: done checking for any_errors_fatal 40074 1727204616.24982: checking for max_fail_percentage 40074 1727204616.24984: done checking for max_fail_percentage 40074 1727204616.24985: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.24986: done checking to see if all hosts have failed 40074 1727204616.24987: getting the remaining hosts for this loop 40074 1727204616.24988: done getting the remaining hosts for this loop 40074 1727204616.24993: getting the next task for host managed-node2 40074 1727204616.24998: done getting next task for host managed-node2 40074 1727204616.25003: ^ task is: TASK: Delete tap interface {{ interface }} 40074 1727204616.25006: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.25009: getting variables 40074 1727204616.25012: in VariableManager get_vars() 40074 1727204616.25048: Calling all_inventory to load vars for managed-node2 40074 1727204616.25051: Calling groups_inventory to load vars for managed-node2 40074 1727204616.25054: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.25064: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.25067: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.25071: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.25236: done sending task result for task 12b410aa-8751-9fd7-2501-000000000173 40074 1727204616.25240: WORKER PROCESS EXITING 40074 1727204616.25253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.25454: done with get_vars() 40074 1727204616.25463: done getting variables 40074 1727204616.25512: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204616.25601: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest0] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.023) 0:00:10.018 ***** 40074 1727204616.25625: entering _queue_task() for managed-node2/command 40074 1727204616.25820: worker is 1 (out of 1 available) 40074 1727204616.25832: exiting _queue_task() for managed-node2/command 40074 1727204616.25843: done queuing things up, now waiting for results queue to drain 40074 1727204616.25845: waiting for pending results... 
40074 1727204616.26006: running TaskExecutor() for managed-node2/TASK: Delete tap interface ethtest0 40074 1727204616.26080: in run() - task 12b410aa-8751-9fd7-2501-000000000174 40074 1727204616.26094: variable 'ansible_search_path' from source: unknown 40074 1727204616.26098: variable 'ansible_search_path' from source: unknown 40074 1727204616.26128: calling self._execute() 40074 1727204616.26203: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.26207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.26218: variable 'omit' from source: magic vars 40074 1727204616.26514: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.26593: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.26704: variable 'type' from source: set_fact 40074 1727204616.26707: variable 'state' from source: include params 40074 1727204616.26713: variable 'interface' from source: set_fact 40074 1727204616.26719: variable 'current_interfaces' from source: set_fact 40074 1727204616.26727: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 40074 1727204616.26730: when evaluation is False, skipping this task 40074 1727204616.26742: _execute() done 40074 1727204616.26744: dumping result to json 40074 1727204616.26747: done dumping result, returning 40074 1727204616.26752: done running TaskExecutor() for managed-node2/TASK: Delete tap interface ethtest0 [12b410aa-8751-9fd7-2501-000000000174] 40074 1727204616.26757: sending task result for task 12b410aa-8751-9fd7-2501-000000000174 40074 1727204616.26846: done sending task result for task 12b410aa-8751-9fd7-2501-000000000174 40074 1727204616.26850: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
40074 1727204616.26899: no more pending results, returning what we have 40074 1727204616.26903: results queue empty 40074 1727204616.26904: checking for any_errors_fatal 40074 1727204616.26908: done checking for any_errors_fatal 40074 1727204616.26909: checking for max_fail_percentage 40074 1727204616.26910: done checking for max_fail_percentage 40074 1727204616.26911: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.26912: done checking to see if all hosts have failed 40074 1727204616.26913: getting the remaining hosts for this loop 40074 1727204616.26914: done getting the remaining hosts for this loop 40074 1727204616.26918: getting the next task for host managed-node2 40074 1727204616.26924: done getting next task for host managed-node2 40074 1727204616.26927: ^ task is: TASK: Assert device is present 40074 1727204616.26930: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.26934: getting variables 40074 1727204616.26935: in VariableManager get_vars() 40074 1727204616.26970: Calling all_inventory to load vars for managed-node2 40074 1727204616.26974: Calling groups_inventory to load vars for managed-node2 40074 1727204616.26976: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.26983: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.26986: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.26988: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.27180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.27373: done with get_vars() 40074 1727204616.27381: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:21 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.018) 0:00:10.036 ***** 40074 1727204616.27452: entering _queue_task() for managed-node2/include_tasks 40074 1727204616.27633: worker is 1 (out of 1 available) 40074 1727204616.27647: exiting _queue_task() for managed-node2/include_tasks 40074 1727204616.27660: done queuing things up, now waiting for results queue to drain 40074 1727204616.27662: waiting for pending results... 
40074 1727204616.27809: running TaskExecutor() for managed-node2/TASK: Assert device is present 40074 1727204616.27873: in run() - task 12b410aa-8751-9fd7-2501-00000000000e 40074 1727204616.27885: variable 'ansible_search_path' from source: unknown 40074 1727204616.27916: calling self._execute() 40074 1727204616.27986: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.27996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.28009: variable 'omit' from source: magic vars 40074 1727204616.28285: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.28296: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.28303: _execute() done 40074 1727204616.28306: dumping result to json 40074 1727204616.28311: done dumping result, returning 40074 1727204616.28319: done running TaskExecutor() for managed-node2/TASK: Assert device is present [12b410aa-8751-9fd7-2501-00000000000e] 40074 1727204616.28322: sending task result for task 12b410aa-8751-9fd7-2501-00000000000e 40074 1727204616.28417: done sending task result for task 12b410aa-8751-9fd7-2501-00000000000e 40074 1727204616.28420: WORKER PROCESS EXITING 40074 1727204616.28467: no more pending results, returning what we have 40074 1727204616.28471: in VariableManager get_vars() 40074 1727204616.28515: Calling all_inventory to load vars for managed-node2 40074 1727204616.28518: Calling groups_inventory to load vars for managed-node2 40074 1727204616.28521: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.28531: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.28534: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.28536: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.28698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 40074 1727204616.28891: done with get_vars() 40074 1727204616.28899: variable 'ansible_search_path' from source: unknown 40074 1727204616.28909: we have included files to process 40074 1727204616.28909: generating all_blocks data 40074 1727204616.28911: done generating all_blocks data 40074 1727204616.28914: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 40074 1727204616.28915: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 40074 1727204616.28917: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 40074 1727204616.29038: in VariableManager get_vars() 40074 1727204616.29055: done with get_vars() 40074 1727204616.29141: done processing included file 40074 1727204616.29143: iterating over new_blocks loaded from include file 40074 1727204616.29144: in VariableManager get_vars() 40074 1727204616.29183: done with get_vars() 40074 1727204616.29185: filtering new block on tags 40074 1727204616.29202: done filtering new block on tags 40074 1727204616.29204: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 40074 1727204616.29207: extending task lists for all hosts with included blocks 40074 1727204616.29697: done extending task lists 40074 1727204616.29698: done processing included files 40074 1727204616.29699: results queue empty 40074 1727204616.29700: checking for any_errors_fatal 40074 1727204616.29702: done checking for any_errors_fatal 40074 1727204616.29702: checking for max_fail_percentage 40074 1727204616.29703: done checking for max_fail_percentage 40074 1727204616.29704: checking to see if all hosts have failed and the 
running result is not ok 40074 1727204616.29704: done checking to see if all hosts have failed 40074 1727204616.29705: getting the remaining hosts for this loop 40074 1727204616.29706: done getting the remaining hosts for this loop 40074 1727204616.29708: getting the next task for host managed-node2 40074 1727204616.29711: done getting next task for host managed-node2 40074 1727204616.29712: ^ task is: TASK: Include the task 'get_interface_stat.yml' 40074 1727204616.29714: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.29715: getting variables 40074 1727204616.29716: in VariableManager get_vars() 40074 1727204616.29727: Calling all_inventory to load vars for managed-node2 40074 1727204616.29730: Calling groups_inventory to load vars for managed-node2 40074 1727204616.29732: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.29737: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.29739: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.29741: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.29872: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.30073: done with get_vars() 40074 1727204616.30081: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.026) 0:00:10.063 ***** 40074 1727204616.30136: entering _queue_task() for managed-node2/include_tasks 40074 1727204616.30329: worker is 1 (out of 1 available) 40074 1727204616.30341: exiting _queue_task() for managed-node2/include_tasks 40074 1727204616.30354: done queuing things up, now waiting for results queue to drain 40074 1727204616.30355: waiting for pending results... 
40074 1727204616.30517: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 40074 1727204616.30588: in run() - task 12b410aa-8751-9fd7-2501-000000000214 40074 1727204616.30597: variable 'ansible_search_path' from source: unknown 40074 1727204616.30604: variable 'ansible_search_path' from source: unknown 40074 1727204616.30631: calling self._execute() 40074 1727204616.30710: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.30717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.30726: variable 'omit' from source: magic vars 40074 1727204616.31022: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.31094: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.31097: _execute() done 40074 1727204616.31100: dumping result to json 40074 1727204616.31102: done dumping result, returning 40074 1727204616.31104: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-9fd7-2501-000000000214] 40074 1727204616.31106: sending task result for task 12b410aa-8751-9fd7-2501-000000000214 40074 1727204616.31175: done sending task result for task 12b410aa-8751-9fd7-2501-000000000214 40074 1727204616.31177: WORKER PROCESS EXITING 40074 1727204616.31204: no more pending results, returning what we have 40074 1727204616.31208: in VariableManager get_vars() 40074 1727204616.31250: Calling all_inventory to load vars for managed-node2 40074 1727204616.31254: Calling groups_inventory to load vars for managed-node2 40074 1727204616.31256: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.31265: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.31267: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.31269: Calling groups_plugins_play to load vars for managed-node2 40074 
1727204616.31429: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.31623: done with get_vars() 40074 1727204616.31629: variable 'ansible_search_path' from source: unknown 40074 1727204616.31632: variable 'ansible_search_path' from source: unknown 40074 1727204616.31660: we have included files to process 40074 1727204616.31661: generating all_blocks data 40074 1727204616.31662: done generating all_blocks data 40074 1727204616.31663: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204616.31664: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204616.31665: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204616.31850: done processing included file 40074 1727204616.31852: iterating over new_blocks loaded from include file 40074 1727204616.31853: in VariableManager get_vars() 40074 1727204616.31867: done with get_vars() 40074 1727204616.31868: filtering new block on tags 40074 1727204616.31880: done filtering new block on tags 40074 1727204616.31882: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 40074 1727204616.31886: extending task lists for all hosts with included blocks 40074 1727204616.31969: done extending task lists 40074 1727204616.31970: done processing included files 40074 1727204616.31971: results queue empty 40074 1727204616.31971: checking for any_errors_fatal 40074 1727204616.31974: done checking for any_errors_fatal 40074 1727204616.31975: checking for max_fail_percentage 40074 1727204616.31975: done checking for 
max_fail_percentage 40074 1727204616.31976: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.31977: done checking to see if all hosts have failed 40074 1727204616.31977: getting the remaining hosts for this loop 40074 1727204616.31978: done getting the remaining hosts for this loop 40074 1727204616.31980: getting the next task for host managed-node2 40074 1727204616.31983: done getting next task for host managed-node2 40074 1727204616.31985: ^ task is: TASK: Get stat for interface {{ interface }} 40074 1727204616.31987: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.31991: getting variables 40074 1727204616.31991: in VariableManager get_vars() 40074 1727204616.32001: Calling all_inventory to load vars for managed-node2 40074 1727204616.32003: Calling groups_inventory to load vars for managed-node2 40074 1727204616.32005: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.32009: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.32010: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.32013: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.32171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.32360: done with get_vars() 40074 1727204616.32368: done getting variables 40074 1727204616.32491: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.023) 0:00:10.086 ***** 40074 1727204616.32514: entering _queue_task() for managed-node2/stat 40074 1727204616.32712: worker is 1 (out of 1 available) 40074 1727204616.32727: exiting _queue_task() for managed-node2/stat 40074 1727204616.32742: done queuing things up, now waiting for results queue to drain 40074 1727204616.32744: waiting for pending results... 
40074 1727204616.32901: running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest0 40074 1727204616.32983: in run() - task 12b410aa-8751-9fd7-2501-000000000267 40074 1727204616.32993: variable 'ansible_search_path' from source: unknown 40074 1727204616.32997: variable 'ansible_search_path' from source: unknown 40074 1727204616.33029: calling self._execute() 40074 1727204616.33099: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.33107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.33116: variable 'omit' from source: magic vars 40074 1727204616.33412: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.33420: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.33427: variable 'omit' from source: magic vars 40074 1727204616.33464: variable 'omit' from source: magic vars 40074 1727204616.33546: variable 'interface' from source: set_fact 40074 1727204616.33560: variable 'omit' from source: magic vars 40074 1727204616.33598: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204616.33630: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204616.33649: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204616.33665: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204616.33677: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204616.33704: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204616.33708: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.33711: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.33800: Set connection var ansible_pipelining to False 40074 1727204616.33806: Set connection var ansible_shell_executable to /bin/sh 40074 1727204616.33810: Set connection var ansible_shell_type to sh 40074 1727204616.33812: Set connection var ansible_connection to ssh 40074 1727204616.33820: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204616.33826: Set connection var ansible_timeout to 10 40074 1727204616.33852: variable 'ansible_shell_executable' from source: unknown 40074 1727204616.33856: variable 'ansible_connection' from source: unknown 40074 1727204616.33859: variable 'ansible_module_compression' from source: unknown 40074 1727204616.33862: variable 'ansible_shell_type' from source: unknown 40074 1727204616.33864: variable 'ansible_shell_executable' from source: unknown 40074 1727204616.33870: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.33875: variable 'ansible_pipelining' from source: unknown 40074 1727204616.33877: variable 'ansible_timeout' from source: unknown 40074 1727204616.33883: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.34054: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204616.34065: variable 'omit' from source: magic vars 40074 1727204616.34071: starting attempt loop 40074 1727204616.34074: running the handler 40074 1727204616.34089: _low_level_execute_command(): starting 40074 1727204616.34099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204616.34647: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.34651: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204616.34655: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.34694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204616.34710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.34766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.36516: stdout chunk (state=3): >>>/root <<< 40074 1727204616.36624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.36675: stderr chunk (state=3): >>><<< 40074 1727204616.36680: stdout chunk (state=3): >>><<< 40074 1727204616.36705: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204616.36716: _low_level_execute_command(): starting 40074 1727204616.36722: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019 `" && echo ansible-tmp-1727204616.3670428-40666-144654905705019="` echo /root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019 `" ) && sleep 0' 40074 1727204616.37184: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204616.37187: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204616.37192: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204616.37202: stderr chunk (state=3): >>>debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204616.37205: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.37260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204616.37263: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.37299: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.39323: stdout chunk (state=3): >>>ansible-tmp-1727204616.3670428-40666-144654905705019=/root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019 <<< 40074 1727204616.39442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.39493: stderr chunk (state=3): >>><<< 40074 1727204616.39497: stdout chunk (state=3): >>><<< 40074 1727204616.39513: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204616.3670428-40666-144654905705019=/root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204616.39560: variable 'ansible_module_compression' from source: unknown 40074 1727204616.39605: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 40074 1727204616.39641: variable 'ansible_facts' from source: unknown 40074 1727204616.39702: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/AnsiballZ_stat.py 40074 1727204616.39816: Sending initial data 40074 1727204616.39820: Sent initial data (153 bytes) 40074 1727204616.40293: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204616.40299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204616.40301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204616.40304: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204616.40306: 
stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.40358: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204616.40364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.40405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.42080: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 40074 1727204616.42084: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204616.42116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204616.42160: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpyt4p21pz /root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/AnsiballZ_stat.py <<< 40074 1727204616.42163: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/AnsiballZ_stat.py" <<< 40074 1727204616.42199: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpyt4p21pz" to remote "/root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/AnsiballZ_stat.py" <<< 40074 1727204616.43006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.43037: stderr chunk (state=3): >>><<< 40074 1727204616.43044: stdout chunk (state=3): >>><<< 40074 1727204616.43064: done transferring module to remote 40074 1727204616.43074: _low_level_execute_command(): starting 40074 1727204616.43080: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/ /root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/AnsiballZ_stat.py && sleep 0' 40074 1727204616.43536: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204616.43540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204616.43542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass <<< 40074 1727204616.43545: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204616.43547: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.43605: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204616.43611: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.43651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.45618: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.45659: stderr chunk (state=3): >>><<< 40074 1727204616.45662: stdout chunk (state=3): >>><<< 40074 1727204616.45679: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204616.45682: _low_level_execute_command(): starting 40074 1727204616.45688: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/AnsiballZ_stat.py && sleep 0' 40074 1727204616.46167: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.46171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204616.46174: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.46176: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.46178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.46230: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204616.46233: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 40074 1727204616.46284: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.64003: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 39589, "dev": 23, "nlink": 1, "atime": 1727204614.8375576, "mtime": 1727204614.8375576, "ctime": 1727204614.8375576, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 40074 1727204616.65492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204616.65552: stderr chunk (state=3): >>><<< 40074 1727204616.65556: stdout chunk (state=3): >>><<< 40074 1727204616.65575: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest0", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 39589, "dev": 23, "nlink": 1, "atime": 1727204614.8375576, "mtime": 1727204614.8375576, "ctime": 1727204614.8375576, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204616.65633: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204616.65645: _low_level_execute_command(): starting 40074 1727204616.65651: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204616.3670428-40666-144654905705019/ > /dev/null 2>&1 && sleep 0' 40074 1727204616.66156: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.66160: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.66162: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204616.66164: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.66167: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.66220: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204616.66223: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.66267: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.68200: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.68251: stderr chunk (state=3): >>><<< 40074 1727204616.68255: stdout chunk (state=3): >>><<< 40074 1727204616.68268: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204616.68276: handler run complete 40074 1727204616.68321: attempt loop complete, returning result 40074 1727204616.68324: _execute() done 40074 1727204616.68330: dumping result to json 40074 1727204616.68341: done dumping result, returning 40074 1727204616.68349: done running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest0 [12b410aa-8751-9fd7-2501-000000000267] 40074 1727204616.68354: sending task result for task 12b410aa-8751-9fd7-2501-000000000267 40074 1727204616.68468: done sending task result for task 12b410aa-8751-9fd7-2501-000000000267 40074 1727204616.68471: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204614.8375576, "block_size": 4096, "blocks": 0, "ctime": 1727204614.8375576, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 39589, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest0", "lnk_target": "../../devices/virtual/net/ethtest0", "mode": "0777", "mtime": 1727204614.8375576, "nlink": 1, "path": "/sys/class/net/ethtest0", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 40074 1727204616.68585: no more pending results, returning what we have 40074 1727204616.68588: results queue empty 40074 1727204616.68598: checking for any_errors_fatal 40074 1727204616.68600: done 
checking for any_errors_fatal 40074 1727204616.68601: checking for max_fail_percentage 40074 1727204616.68603: done checking for max_fail_percentage 40074 1727204616.68604: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.68605: done checking to see if all hosts have failed 40074 1727204616.68606: getting the remaining hosts for this loop 40074 1727204616.68607: done getting the remaining hosts for this loop 40074 1727204616.68611: getting the next task for host managed-node2 40074 1727204616.68619: done getting next task for host managed-node2 40074 1727204616.68622: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 40074 1727204616.68625: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.68628: getting variables 40074 1727204616.68630: in VariableManager get_vars() 40074 1727204616.68668: Calling all_inventory to load vars for managed-node2 40074 1727204616.68671: Calling groups_inventory to load vars for managed-node2 40074 1727204616.68673: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.68685: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.68688: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.68694: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.68900: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.69107: done with get_vars() 40074 1727204616.69116: done getting variables 40074 1727204616.69200: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 40074 1727204616.69299: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest0'] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.368) 0:00:10.455 ***** 40074 1727204616.69323: entering _queue_task() for managed-node2/assert 40074 1727204616.69324: Creating lock for assert 40074 1727204616.69540: worker is 1 (out of 1 available) 40074 1727204616.69554: exiting _queue_task() for managed-node2/assert 40074 1727204616.69567: done queuing things up, now waiting for results queue to drain 40074 1727204616.69569: waiting for pending results... 
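For reference, the stat check and assertion driving this section of the log correspond to a task pair along these lines. This is a sketch reconstructed from the logged `stat` module invocation and the evaluated conditional `interface_stat.stat.exists`; the actual `assert_device_present.yml` in the collection may differ in detail.

```yaml
# Sketch inferred from the logged module args and conditional;
# the real assert_device_present.yml may differ.
- name: "Get stat for interface {{ interface }}"
  stat:
    get_attributes: false
    get_checksum: false
    get_mime: false
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat

- name: "Assert that the interface is present - '{{ interface }}'"
  assert:
    that:
      - interface_stat.stat.exists
```

The logged result confirms `/sys/class/net/ethtest0` exists as a symlink into `/sys/devices/virtual/net/`, i.e. a virtual (veth-style) test interface, so the assertion passes.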
40074 1727204616.69744: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'ethtest0' 40074 1727204616.69813: in run() - task 12b410aa-8751-9fd7-2501-000000000215 40074 1727204616.69827: variable 'ansible_search_path' from source: unknown 40074 1727204616.69831: variable 'ansible_search_path' from source: unknown 40074 1727204616.69863: calling self._execute() 40074 1727204616.69943: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.69949: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.69959: variable 'omit' from source: magic vars 40074 1727204616.70262: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.70272: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.70279: variable 'omit' from source: magic vars 40074 1727204616.70311: variable 'omit' from source: magic vars 40074 1727204616.70395: variable 'interface' from source: set_fact 40074 1727204616.70411: variable 'omit' from source: magic vars 40074 1727204616.70449: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204616.70480: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204616.70499: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204616.70516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204616.70527: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204616.70557: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204616.70562: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.70565: 
variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.70656: Set connection var ansible_pipelining to False 40074 1727204616.70662: Set connection var ansible_shell_executable to /bin/sh 40074 1727204616.70665: Set connection var ansible_shell_type to sh 40074 1727204616.70668: Set connection var ansible_connection to ssh 40074 1727204616.70677: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204616.70688: Set connection var ansible_timeout to 10 40074 1727204616.70708: variable 'ansible_shell_executable' from source: unknown 40074 1727204616.70712: variable 'ansible_connection' from source: unknown 40074 1727204616.70714: variable 'ansible_module_compression' from source: unknown 40074 1727204616.70719: variable 'ansible_shell_type' from source: unknown 40074 1727204616.70721: variable 'ansible_shell_executable' from source: unknown 40074 1727204616.70726: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.70731: variable 'ansible_pipelining' from source: unknown 40074 1727204616.70737: variable 'ansible_timeout' from source: unknown 40074 1727204616.70742: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.70860: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204616.70870: variable 'omit' from source: magic vars 40074 1727204616.70876: starting attempt loop 40074 1727204616.70879: running the handler 40074 1727204616.70994: variable 'interface_stat' from source: set_fact 40074 1727204616.71018: Evaluated conditional (interface_stat.stat.exists): True 40074 1727204616.71023: handler run complete 40074 1727204616.71039: attempt loop complete, returning result 40074 
1727204616.71043: _execute() done 40074 1727204616.71045: dumping result to json 40074 1727204616.71050: done dumping result, returning 40074 1727204616.71057: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'ethtest0' [12b410aa-8751-9fd7-2501-000000000215] 40074 1727204616.71062: sending task result for task 12b410aa-8751-9fd7-2501-000000000215 40074 1727204616.71149: done sending task result for task 12b410aa-8751-9fd7-2501-000000000215 40074 1727204616.71152: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 40074 1727204616.71206: no more pending results, returning what we have 40074 1727204616.71210: results queue empty 40074 1727204616.71211: checking for any_errors_fatal 40074 1727204616.71217: done checking for any_errors_fatal 40074 1727204616.71218: checking for max_fail_percentage 40074 1727204616.71219: done checking for max_fail_percentage 40074 1727204616.71220: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.71221: done checking to see if all hosts have failed 40074 1727204616.71222: getting the remaining hosts for this loop 40074 1727204616.71223: done getting the remaining hosts for this loop 40074 1727204616.71227: getting the next task for host managed-node2 40074 1727204616.71233: done getting next task for host managed-node2 40074 1727204616.71236: ^ task is: TASK: Set interface1 40074 1727204616.71238: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.71242: getting variables 40074 1727204616.71243: in VariableManager get_vars() 40074 1727204616.71286: Calling all_inventory to load vars for managed-node2 40074 1727204616.71290: Calling groups_inventory to load vars for managed-node2 40074 1727204616.71293: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.71302: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.71304: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.71307: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.71467: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.71663: done with get_vars() 40074 1727204616.71671: done getting variables 40074 1727204616.71720: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set interface1] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:23 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.024) 0:00:10.479 ***** 40074 1727204616.71742: entering _queue_task() for managed-node2/set_fact 40074 1727204616.71931: worker is 1 (out of 1 available) 40074 1727204616.71947: exiting _queue_task() for managed-node2/set_fact 40074 1727204616.71959: done queuing things up, now waiting for results queue to drain 40074 1727204616.71961: waiting for pending results... 
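The `Set interface1` task queued here (tests_route_device.yml:23) presumably looks like the following sketch, inferred from the logged result (`ansible_facts: interface: ethtest1`) and the `interface1` play variable resolved just before the handler ran; the exact wording in the playbook may differ.

```yaml
# Sketch inferred from the logged set_fact result; the play-level
# variable interface1 evaluates to 'ethtest1' in this run.
- name: Set interface1
  set_fact:
    interface: "{{ interface1 }}"
```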
40074 1727204616.72118: running TaskExecutor() for managed-node2/TASK: Set interface1 40074 1727204616.72175: in run() - task 12b410aa-8751-9fd7-2501-00000000000f 40074 1727204616.72192: variable 'ansible_search_path' from source: unknown 40074 1727204616.72221: calling self._execute() 40074 1727204616.72300: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.72306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.72314: variable 'omit' from source: magic vars 40074 1727204616.72599: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.72609: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.72615: variable 'omit' from source: magic vars 40074 1727204616.72643: variable 'omit' from source: magic vars 40074 1727204616.72666: variable 'interface1' from source: play vars 40074 1727204616.72730: variable 'interface1' from source: play vars 40074 1727204616.72749: variable 'omit' from source: magic vars 40074 1727204616.72781: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204616.72811: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204616.72829: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204616.72849: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204616.72862: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204616.72887: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204616.72892: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.72897: variable 'ansible_ssh_extra_args' from source: 
host vars for 'managed-node2' 40074 1727204616.72984: Set connection var ansible_pipelining to False 40074 1727204616.72995: Set connection var ansible_shell_executable to /bin/sh 40074 1727204616.72998: Set connection var ansible_shell_type to sh 40074 1727204616.73001: Set connection var ansible_connection to ssh 40074 1727204616.73006: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204616.73013: Set connection var ansible_timeout to 10 40074 1727204616.73035: variable 'ansible_shell_executable' from source: unknown 40074 1727204616.73041: variable 'ansible_connection' from source: unknown 40074 1727204616.73044: variable 'ansible_module_compression' from source: unknown 40074 1727204616.73048: variable 'ansible_shell_type' from source: unknown 40074 1727204616.73051: variable 'ansible_shell_executable' from source: unknown 40074 1727204616.73055: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.73065: variable 'ansible_pipelining' from source: unknown 40074 1727204616.73070: variable 'ansible_timeout' from source: unknown 40074 1727204616.73072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.73244: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204616.73254: variable 'omit' from source: magic vars 40074 1727204616.73260: starting attempt loop 40074 1727204616.73263: running the handler 40074 1727204616.73274: handler run complete 40074 1727204616.73285: attempt loop complete, returning result 40074 1727204616.73288: _execute() done 40074 1727204616.73296: dumping result to json 40074 1727204616.73299: done dumping result, returning 40074 1727204616.73306: done running TaskExecutor() for 
managed-node2/TASK: Set interface1 [12b410aa-8751-9fd7-2501-00000000000f] 40074 1727204616.73311: sending task result for task 12b410aa-8751-9fd7-2501-00000000000f 40074 1727204616.73391: done sending task result for task 12b410aa-8751-9fd7-2501-00000000000f 40074 1727204616.73394: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "interface": "ethtest1" }, "changed": false } 40074 1727204616.73456: no more pending results, returning what we have 40074 1727204616.73459: results queue empty 40074 1727204616.73461: checking for any_errors_fatal 40074 1727204616.73466: done checking for any_errors_fatal 40074 1727204616.73466: checking for max_fail_percentage 40074 1727204616.73468: done checking for max_fail_percentage 40074 1727204616.73469: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.73470: done checking to see if all hosts have failed 40074 1727204616.73471: getting the remaining hosts for this loop 40074 1727204616.73472: done getting the remaining hosts for this loop 40074 1727204616.73476: getting the next task for host managed-node2 40074 1727204616.73481: done getting next task for host managed-node2 40074 1727204616.73483: ^ task is: TASK: Show interfaces 40074 1727204616.73485: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.73491: getting variables 40074 1727204616.73492: in VariableManager get_vars() 40074 1727204616.73528: Calling all_inventory to load vars for managed-node2 40074 1727204616.73531: Calling groups_inventory to load vars for managed-node2 40074 1727204616.73534: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.73543: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.73545: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.73547: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.73735: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.73926: done with get_vars() 40074 1727204616.73935: done getting variables TASK [Show interfaces] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:26 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.022) 0:00:10.501 ***** 40074 1727204616.74005: entering _queue_task() for managed-node2/include_tasks 40074 1727204616.74193: worker is 1 (out of 1 available) 40074 1727204616.74207: exiting _queue_task() for managed-node2/include_tasks 40074 1727204616.74219: done queuing things up, now waiting for results queue to drain 40074 1727204616.74221: waiting for pending results... 
40074 1727204616.74375: running TaskExecutor() for managed-node2/TASK: Show interfaces 40074 1727204616.74434: in run() - task 12b410aa-8751-9fd7-2501-000000000010 40074 1727204616.74450: variable 'ansible_search_path' from source: unknown 40074 1727204616.74485: calling self._execute() 40074 1727204616.74563: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.74572: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.74581: variable 'omit' from source: magic vars 40074 1727204616.74875: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.74886: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.74897: _execute() done 40074 1727204616.74901: dumping result to json 40074 1727204616.74906: done dumping result, returning 40074 1727204616.74913: done running TaskExecutor() for managed-node2/TASK: Show interfaces [12b410aa-8751-9fd7-2501-000000000010] 40074 1727204616.74918: sending task result for task 12b410aa-8751-9fd7-2501-000000000010 40074 1727204616.75007: done sending task result for task 12b410aa-8751-9fd7-2501-000000000010 40074 1727204616.75010: WORKER PROCESS EXITING 40074 1727204616.75043: no more pending results, returning what we have 40074 1727204616.75047: in VariableManager get_vars() 40074 1727204616.75087: Calling all_inventory to load vars for managed-node2 40074 1727204616.75092: Calling groups_inventory to load vars for managed-node2 40074 1727204616.75095: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.75106: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.75109: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.75113: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.75279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 
1727204616.75469: done with get_vars() 40074 1727204616.75475: variable 'ansible_search_path' from source: unknown 40074 1727204616.75486: we have included files to process 40074 1727204616.75487: generating all_blocks data 40074 1727204616.75491: done generating all_blocks data 40074 1727204616.75495: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204616.75495: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204616.75497: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204616.75577: in VariableManager get_vars() 40074 1727204616.75596: done with get_vars() 40074 1727204616.75683: done processing included file 40074 1727204616.75685: iterating over new_blocks loaded from include file 40074 1727204616.75686: in VariableManager get_vars() 40074 1727204616.75724: done with get_vars() 40074 1727204616.75725: filtering new block on tags 40074 1727204616.75740: done filtering new block on tags 40074 1727204616.75742: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 40074 1727204616.75745: extending task lists for all hosts with included blocks 40074 1727204616.76306: done extending task lists 40074 1727204616.76307: done processing included files 40074 1727204616.76307: results queue empty 40074 1727204616.76308: checking for any_errors_fatal 40074 1727204616.76310: done checking for any_errors_fatal 40074 1727204616.76310: checking for max_fail_percentage 40074 1727204616.76311: done checking for max_fail_percentage 40074 1727204616.76312: checking to see if all hosts have failed and the running result is not ok 40074 
1727204616.76313: done checking to see if all hosts have failed 40074 1727204616.76313: getting the remaining hosts for this loop 40074 1727204616.76314: done getting the remaining hosts for this loop 40074 1727204616.76316: getting the next task for host managed-node2 40074 1727204616.76318: done getting next task for host managed-node2 40074 1727204616.76320: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 40074 1727204616.76322: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.76323: getting variables 40074 1727204616.76324: in VariableManager get_vars() 40074 1727204616.76336: Calling all_inventory to load vars for managed-node2 40074 1727204616.76338: Calling groups_inventory to load vars for managed-node2 40074 1727204616.76339: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.76343: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.76346: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.76349: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.76482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.76683: done with get_vars() 40074 1727204616.76692: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.027) 0:00:10.529 ***** 40074 1727204616.76745: entering _queue_task() for managed-node2/include_tasks 40074 1727204616.76936: worker is 1 (out of 1 available) 40074 1727204616.76951: exiting _queue_task() for managed-node2/include_tasks 40074 1727204616.76964: done queuing things up, now waiting for results queue to drain 40074 1727204616.76965: waiting for pending results... 
40074 1727204616.77119: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 40074 1727204616.77183: in run() - task 12b410aa-8751-9fd7-2501-000000000282 40074 1727204616.77205: variable 'ansible_search_path' from source: unknown 40074 1727204616.77209: variable 'ansible_search_path' from source: unknown 40074 1727204616.77231: calling self._execute() 40074 1727204616.77308: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.77317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.77325: variable 'omit' from source: magic vars 40074 1727204616.77619: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.77629: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.77639: _execute() done 40074 1727204616.77643: dumping result to json 40074 1727204616.77646: done dumping result, returning 40074 1727204616.77652: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-9fd7-2501-000000000282] 40074 1727204616.77658: sending task result for task 12b410aa-8751-9fd7-2501-000000000282 40074 1727204616.77745: done sending task result for task 12b410aa-8751-9fd7-2501-000000000282 40074 1727204616.77748: WORKER PROCESS EXITING 40074 1727204616.77778: no more pending results, returning what we have 40074 1727204616.77782: in VariableManager get_vars() 40074 1727204616.77824: Calling all_inventory to load vars for managed-node2 40074 1727204616.77827: Calling groups_inventory to load vars for managed-node2 40074 1727204616.77829: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.77839: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.77842: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.77846: Calling groups_plugins_play to load vars for managed-node2 40074 
1727204616.78013: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.78204: done with get_vars() 40074 1727204616.78212: variable 'ansible_search_path' from source: unknown 40074 1727204616.78212: variable 'ansible_search_path' from source: unknown 40074 1727204616.78240: we have included files to process 40074 1727204616.78241: generating all_blocks data 40074 1727204616.78242: done generating all_blocks data 40074 1727204616.78243: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204616.78244: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204616.78245: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204616.78451: done processing included file 40074 1727204616.78452: iterating over new_blocks loaded from include file 40074 1727204616.78454: in VariableManager get_vars() 40074 1727204616.78467: done with get_vars() 40074 1727204616.78469: filtering new block on tags 40074 1727204616.78482: done filtering new block on tags 40074 1727204616.78484: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 40074 1727204616.78487: extending task lists for all hosts with included blocks 40074 1727204616.78569: done extending task lists 40074 1727204616.78570: done processing included files 40074 1727204616.78570: results queue empty 40074 1727204616.78571: checking for any_errors_fatal 40074 1727204616.78573: done checking for any_errors_fatal 40074 1727204616.78574: checking for max_fail_percentage 40074 1727204616.78574: done 
checking for max_fail_percentage 40074 1727204616.78575: checking to see if all hosts have failed and the running result is not ok 40074 1727204616.78576: done checking to see if all hosts have failed 40074 1727204616.78576: getting the remaining hosts for this loop 40074 1727204616.78577: done getting the remaining hosts for this loop 40074 1727204616.78579: getting the next task for host managed-node2 40074 1727204616.78582: done getting next task for host managed-node2 40074 1727204616.78583: ^ task is: TASK: Gather current interface info 40074 1727204616.78586: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204616.78587: getting variables 40074 1727204616.78588: in VariableManager get_vars() 40074 1727204616.78600: Calling all_inventory to load vars for managed-node2 40074 1727204616.78601: Calling groups_inventory to load vars for managed-node2 40074 1727204616.78603: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204616.78606: Calling all_plugins_play to load vars for managed-node2 40074 1727204616.78608: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204616.78610: Calling groups_plugins_play to load vars for managed-node2 40074 1727204616.78767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204616.78952: done with get_vars() 40074 1727204616.78960: done getting variables 40074 1727204616.78994: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:36 -0400 (0:00:00.022) 0:00:10.551 ***** 40074 1727204616.79017: entering _queue_task() for managed-node2/command 40074 1727204616.79210: worker is 1 (out of 1 available) 40074 1727204616.79225: exiting _queue_task() for managed-node2/command 40074 1727204616.79236: done queuing things up, now waiting for results queue to drain 40074 1727204616.79238: waiting for pending results... 
40074 1727204616.79393: running TaskExecutor() for managed-node2/TASK: Gather current interface info 40074 1727204616.79479: in run() - task 12b410aa-8751-9fd7-2501-0000000002e0 40074 1727204616.79492: variable 'ansible_search_path' from source: unknown 40074 1727204616.79496: variable 'ansible_search_path' from source: unknown 40074 1727204616.79527: calling self._execute() 40074 1727204616.79603: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.79609: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.79619: variable 'omit' from source: magic vars 40074 1727204616.79910: variable 'ansible_distribution_major_version' from source: facts 40074 1727204616.79921: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204616.79928: variable 'omit' from source: magic vars 40074 1727204616.79970: variable 'omit' from source: magic vars 40074 1727204616.80001: variable 'omit' from source: magic vars 40074 1727204616.80038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204616.80069: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204616.80087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204616.80105: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204616.80116: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204616.80148: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204616.80151: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.80154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 
1727204616.80245: Set connection var ansible_pipelining to False 40074 1727204616.80249: Set connection var ansible_shell_executable to /bin/sh 40074 1727204616.80252: Set connection var ansible_shell_type to sh 40074 1727204616.80256: Set connection var ansible_connection to ssh 40074 1727204616.80263: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204616.80270: Set connection var ansible_timeout to 10 40074 1727204616.80292: variable 'ansible_shell_executable' from source: unknown 40074 1727204616.80297: variable 'ansible_connection' from source: unknown 40074 1727204616.80300: variable 'ansible_module_compression' from source: unknown 40074 1727204616.80303: variable 'ansible_shell_type' from source: unknown 40074 1727204616.80308: variable 'ansible_shell_executable' from source: unknown 40074 1727204616.80311: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204616.80317: variable 'ansible_pipelining' from source: unknown 40074 1727204616.80319: variable 'ansible_timeout' from source: unknown 40074 1727204616.80325: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204616.80442: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204616.80452: variable 'omit' from source: magic vars 40074 1727204616.80464: starting attempt loop 40074 1727204616.80468: running the handler 40074 1727204616.80482: _low_level_execute_command(): starting 40074 1727204616.80490: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204616.81042: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config <<< 40074 1727204616.81046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204616.81049: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204616.81052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.81095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204616.81115: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.81160: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.82928: stdout chunk (state=3): >>>/root <<< 40074 1727204616.83038: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.83092: stderr chunk (state=3): >>><<< 40074 1727204616.83096: stdout chunk (state=3): >>><<< 40074 1727204616.83116: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204616.83128: _low_level_execute_command(): starting 40074 1727204616.83135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768 `" && echo ansible-tmp-1727204616.8311572-40678-110748859189768="` echo /root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768 `" ) && sleep 0' 40074 1727204616.83587: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.83591: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.83594: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.83613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.83656: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204616.83662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.83703: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.85747: stdout chunk (state=3): >>>ansible-tmp-1727204616.8311572-40678-110748859189768=/root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768 <<< 40074 1727204616.85866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.85912: stderr chunk (state=3): >>><<< 40074 1727204616.85915: stdout chunk (state=3): >>><<< 40074 1727204616.85929: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204616.8311572-40678-110748859189768=/root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204616.85958: variable 'ansible_module_compression' from source: unknown 40074 1727204616.86004: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204616.86034: variable 'ansible_facts' from source: unknown 40074 1727204616.86102: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/AnsiballZ_command.py 40074 1727204616.86210: Sending initial data 40074 1727204616.86213: Sent initial data (156 bytes) 40074 1727204616.86673: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204616.86676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204616.86679: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.86682: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found <<< 40074 1727204616.86685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.86736: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204616.86739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.86784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.88439: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 40074 1727204616.88443: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204616.88473: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204616.88511: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpx_k41ru4 /root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/AnsiballZ_command.py <<< 40074 1727204616.88520: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/AnsiballZ_command.py" <<< 40074 1727204616.88548: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 40074 1727204616.88551: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpx_k41ru4" to remote "/root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/AnsiballZ_command.py" <<< 40074 1727204616.89324: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.89392: stderr chunk (state=3): >>><<< 40074 1727204616.89396: stdout chunk (state=3): >>><<< 40074 1727204616.89418: done transferring module to remote 40074 1727204616.89428: _low_level_execute_command(): starting 40074 1727204616.89436: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/ /root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/AnsiballZ_command.py && sleep 0' 40074 1727204616.89895: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204616.89899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204616.89901: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204616.89903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204616.89915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.89961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204616.89979: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.90015: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204616.91953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204616.92007: stderr chunk (state=3): >>><<< 40074 1727204616.92011: stdout chunk (state=3): >>><<< 40074 1727204616.92025: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204616.92028: _low_level_execute_command(): starting 40074 1727204616.92035: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/AnsiballZ_command.py && sleep 0' 40074 1727204616.92495: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204616.92498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.92501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204616.92503: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204616.92512: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204616.92554: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 
1727204616.92568: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204616.92619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204617.10588: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:37.101288", "end": "2024-09-24 15:03:37.104889", "delta": "0:00:00.003601", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204617.12388: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204617.12453: stderr chunk (state=3): >>><<< 40074 1727204617.12459: stdout chunk (state=3): >>><<< 40074 1727204617.12473: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:37.101288", "end": "2024-09-24 15:03:37.104889", "delta": "0:00:00.003601", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204617.12513: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204617.12522: _low_level_execute_command(): starting 40074 1727204617.12528: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204616.8311572-40678-110748859189768/ > /dev/null 2>&1 && sleep 0' 40074 1727204617.13001: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204617.13027: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.13034: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.13081: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204617.13085: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204617.13133: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204617.15087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204617.15141: stderr chunk (state=3): >>><<< 40074 1727204617.15146: stdout chunk (state=3): >>><<< 40074 1727204617.15160: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204617.15169: handler run complete 40074 1727204617.15192: Evaluated conditional (False): False 40074 1727204617.15203: attempt loop complete, returning result 40074 1727204617.15206: _execute() done 40074 1727204617.15211: dumping result to json 40074 1727204617.15217: done dumping result, returning 40074 1727204617.15226: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-9fd7-2501-0000000002e0] 40074 1727204617.15233: sending task result for task 12b410aa-8751-9fd7-2501-0000000002e0 40074 1727204617.15344: done sending task result for task 12b410aa-8751-9fd7-2501-0000000002e0 40074 1727204617.15347: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003601", "end": "2024-09-24 15:03:37.104889", "rc": 0, "start": "2024-09-24 15:03:37.101288" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 rpltstbr 40074 1727204617.15459: no more pending results, returning what we have 40074 1727204617.15464: results queue empty 40074 1727204617.15465: checking for any_errors_fatal 40074 1727204617.15466: done checking for any_errors_fatal 40074 1727204617.15467: checking for max_fail_percentage 40074 
1727204617.15469: done checking for max_fail_percentage 40074 1727204617.15470: checking to see if all hosts have failed and the running result is not ok 40074 1727204617.15471: done checking to see if all hosts have failed 40074 1727204617.15472: getting the remaining hosts for this loop 40074 1727204617.15473: done getting the remaining hosts for this loop 40074 1727204617.15478: getting the next task for host managed-node2 40074 1727204617.15484: done getting next task for host managed-node2 40074 1727204617.15486: ^ task is: TASK: Set current_interfaces 40074 1727204617.15492: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.15495: getting variables 40074 1727204617.15497: in VariableManager get_vars() 40074 1727204617.15539: Calling all_inventory to load vars for managed-node2 40074 1727204617.15542: Calling groups_inventory to load vars for managed-node2 40074 1727204617.15545: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.15556: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.15559: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.15563: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.15738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.16336: done with get_vars() 40074 1727204617.16348: done getting variables 40074 1727204617.16421: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.374) 0:00:10.926 ***** 40074 1727204617.16460: entering _queue_task() for managed-node2/set_fact 40074 1727204617.16759: worker is 1 (out of 1 available) 40074 1727204617.16773: exiting _queue_task() for managed-node2/set_fact 40074 1727204617.16788: done queuing things up, now waiting for results queue to drain 40074 1727204617.16894: waiting for pending results... 
40074 1727204617.17109: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 40074 1727204617.17300: in run() - task 12b410aa-8751-9fd7-2501-0000000002e1 40074 1727204617.17304: variable 'ansible_search_path' from source: unknown 40074 1727204617.17307: variable 'ansible_search_path' from source: unknown 40074 1727204617.17347: calling self._execute() 40074 1727204617.17464: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.17480: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.17500: variable 'omit' from source: magic vars 40074 1727204617.17986: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.18061: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.18066: variable 'omit' from source: magic vars 40074 1727204617.18097: variable 'omit' from source: magic vars 40074 1727204617.18252: variable '_current_interfaces' from source: set_fact 40074 1727204617.18349: variable 'omit' from source: magic vars 40074 1727204617.18410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204617.18463: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204617.18500: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204617.18596: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204617.18602: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204617.18605: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204617.18608: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.18620: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.18770: Set connection var ansible_pipelining to False 40074 1727204617.18786: Set connection var ansible_shell_executable to /bin/sh 40074 1727204617.18800: Set connection var ansible_shell_type to sh 40074 1727204617.18810: Set connection var ansible_connection to ssh 40074 1727204617.18835: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204617.18906: Set connection var ansible_timeout to 10 40074 1727204617.18909: variable 'ansible_shell_executable' from source: unknown 40074 1727204617.18913: variable 'ansible_connection' from source: unknown 40074 1727204617.18916: variable 'ansible_module_compression' from source: unknown 40074 1727204617.18918: variable 'ansible_shell_type' from source: unknown 40074 1727204617.18920: variable 'ansible_shell_executable' from source: unknown 40074 1727204617.18922: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.18935: variable 'ansible_pipelining' from source: unknown 40074 1727204617.19050: variable 'ansible_timeout' from source: unknown 40074 1727204617.19053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.19141: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204617.19164: variable 'omit' from source: magic vars 40074 1727204617.19191: starting attempt loop 40074 1727204617.19201: running the handler 40074 1727204617.19220: handler run complete 40074 1727204617.19281: attempt loop complete, returning result 40074 1727204617.19284: _execute() done 40074 1727204617.19286: dumping result to json 40074 1727204617.19288: done dumping result, returning 40074 
1727204617.19294: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-9fd7-2501-0000000002e1] 40074 1727204617.19297: sending task result for task 12b410aa-8751-9fd7-2501-0000000002e1 40074 1727204617.19505: done sending task result for task 12b410aa-8751-9fd7-2501-0000000002e1 40074 1727204617.19509: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "ethtest0", "lo", "peerethtest0", "rpltstbr" ] }, "changed": false } 40074 1727204617.19585: no more pending results, returning what we have 40074 1727204617.19592: results queue empty 40074 1727204617.19691: checking for any_errors_fatal 40074 1727204617.19700: done checking for any_errors_fatal 40074 1727204617.19701: checking for max_fail_percentage 40074 1727204617.19703: done checking for max_fail_percentage 40074 1727204617.19704: checking to see if all hosts have failed and the running result is not ok 40074 1727204617.19705: done checking to see if all hosts have failed 40074 1727204617.19706: getting the remaining hosts for this loop 40074 1727204617.19708: done getting the remaining hosts for this loop 40074 1727204617.19712: getting the next task for host managed-node2 40074 1727204617.19720: done getting next task for host managed-node2 40074 1727204617.19723: ^ task is: TASK: Show current_interfaces 40074 1727204617.19727: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.19734: getting variables 40074 1727204617.19736: in VariableManager get_vars() 40074 1727204617.19776: Calling all_inventory to load vars for managed-node2 40074 1727204617.19779: Calling groups_inventory to load vars for managed-node2 40074 1727204617.19782: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.19869: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.19874: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.19879: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.20166: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.20538: done with get_vars() 40074 1727204617.20550: done getting variables 40074 1727204617.20617: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.041) 0:00:10.968 ***** 40074 1727204617.20654: entering _queue_task() for managed-node2/debug 40074 1727204617.20859: worker is 1 (out of 1 available) 40074 1727204617.20874: exiting _queue_task() for managed-node2/debug 40074 1727204617.20886: done queuing things up, now waiting for results queue to drain 40074 1727204617.20888: waiting for pending results... 
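The "Show current_interfaces" task queued above runs the `debug` action from `tasks/show_interfaces.yml:5`. A minimal sketch, with the `msg` format inferred from the `current_interfaces: [...]` line in the result below (the real file may differ):

```yaml
# Hypothetical sketch of the debug task at tasks/show_interfaces.yml:5;
# the msg format is inferred from the logged output, not copied from the file.
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"
```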
40074 1727204617.21060: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 40074 1727204617.21195: in run() - task 12b410aa-8751-9fd7-2501-000000000283 40074 1727204617.21205: variable 'ansible_search_path' from source: unknown 40074 1727204617.21209: variable 'ansible_search_path' from source: unknown 40074 1727204617.21212: calling self._execute() 40074 1727204617.21267: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.21274: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.21282: variable 'omit' from source: magic vars 40074 1727204617.21598: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.21608: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.21615: variable 'omit' from source: magic vars 40074 1727204617.21652: variable 'omit' from source: magic vars 40074 1727204617.21732: variable 'current_interfaces' from source: set_fact 40074 1727204617.21762: variable 'omit' from source: magic vars 40074 1727204617.21797: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204617.21828: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204617.21849: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204617.21867: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204617.21881: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204617.21911: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204617.21914: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.21919: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.22009: Set connection var ansible_pipelining to False 40074 1727204617.22016: Set connection var ansible_shell_executable to /bin/sh 40074 1727204617.22020: Set connection var ansible_shell_type to sh 40074 1727204617.22023: Set connection var ansible_connection to ssh 40074 1727204617.22030: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204617.22039: Set connection var ansible_timeout to 10 40074 1727204617.22060: variable 'ansible_shell_executable' from source: unknown 40074 1727204617.22064: variable 'ansible_connection' from source: unknown 40074 1727204617.22066: variable 'ansible_module_compression' from source: unknown 40074 1727204617.22069: variable 'ansible_shell_type' from source: unknown 40074 1727204617.22077: variable 'ansible_shell_executable' from source: unknown 40074 1727204617.22080: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.22082: variable 'ansible_pipelining' from source: unknown 40074 1727204617.22085: variable 'ansible_timeout' from source: unknown 40074 1727204617.22092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.22212: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204617.22216: variable 'omit' from source: magic vars 40074 1727204617.22223: starting attempt loop 40074 1727204617.22226: running the handler 40074 1727204617.22269: handler run complete 40074 1727204617.22281: attempt loop complete, returning result 40074 1727204617.22284: _execute() done 40074 1727204617.22288: dumping result to json 40074 1727204617.22293: done dumping result, returning 40074 1727204617.22303: done 
running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-9fd7-2501-000000000283] 40074 1727204617.22307: sending task result for task 12b410aa-8751-9fd7-2501-000000000283 40074 1727204617.22395: done sending task result for task 12b410aa-8751-9fd7-2501-000000000283 40074 1727204617.22398: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr'] 40074 1727204617.22468: no more pending results, returning what we have 40074 1727204617.22471: results queue empty 40074 1727204617.22472: checking for any_errors_fatal 40074 1727204617.22477: done checking for any_errors_fatal 40074 1727204617.22478: checking for max_fail_percentage 40074 1727204617.22479: done checking for max_fail_percentage 40074 1727204617.22480: checking to see if all hosts have failed and the running result is not ok 40074 1727204617.22481: done checking to see if all hosts have failed 40074 1727204617.22482: getting the remaining hosts for this loop 40074 1727204617.22483: done getting the remaining hosts for this loop 40074 1727204617.22487: getting the next task for host managed-node2 40074 1727204617.22496: done getting next task for host managed-node2 40074 1727204617.22500: ^ task is: TASK: Manage test interface 40074 1727204617.22502: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.22505: getting variables 40074 1727204617.22507: in VariableManager get_vars() 40074 1727204617.22545: Calling all_inventory to load vars for managed-node2 40074 1727204617.22548: Calling groups_inventory to load vars for managed-node2 40074 1727204617.22550: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.22560: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.22562: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.22565: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.22766: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.22960: done with get_vars() 40074 1727204617.22968: done getting variables TASK [Manage test interface] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:28 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.023) 0:00:10.992 ***** 40074 1727204617.23041: entering _queue_task() for managed-node2/include_tasks 40074 1727204617.23230: worker is 1 (out of 1 available) 40074 1727204617.23245: exiting _queue_task() for managed-node2/include_tasks 40074 1727204617.23257: done queuing things up, now waiting for results queue to drain 40074 1727204617.23259: waiting for pending results... 
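The "Manage test interface" task at `tests_route_device.yml:28` is an `include_tasks` call: the log below shows the included file `manage_test_interface.yml` being loaded, parsed into blocks, and spliced into the task lists. A sketch under stated assumptions (the `state` variable is assumed from the guard task that runs next; the real include may pass different vars):

```yaml
# Hypothetical sketch of the include at tests_route_device.yml:28.
# Passing state is an assumption, suggested by the
# 'Ensure state in ["present", "absent"]' guard that follows.
- name: Manage test interface
  include_tasks: tasks/manage_test_interface.yml
  vars:
    state: present   # assumed; validated by the included file's first task
```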
40074 1727204617.23427: running TaskExecutor() for managed-node2/TASK: Manage test interface 40074 1727204617.23493: in run() - task 12b410aa-8751-9fd7-2501-000000000011 40074 1727204617.23505: variable 'ansible_search_path' from source: unknown 40074 1727204617.23538: calling self._execute() 40074 1727204617.23612: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.23619: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.23628: variable 'omit' from source: magic vars 40074 1727204617.23931: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.23944: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.23951: _execute() done 40074 1727204617.23954: dumping result to json 40074 1727204617.23959: done dumping result, returning 40074 1727204617.23967: done running TaskExecutor() for managed-node2/TASK: Manage test interface [12b410aa-8751-9fd7-2501-000000000011] 40074 1727204617.23972: sending task result for task 12b410aa-8751-9fd7-2501-000000000011 40074 1727204617.24069: done sending task result for task 12b410aa-8751-9fd7-2501-000000000011 40074 1727204617.24072: WORKER PROCESS EXITING 40074 1727204617.24104: no more pending results, returning what we have 40074 1727204617.24108: in VariableManager get_vars() 40074 1727204617.24147: Calling all_inventory to load vars for managed-node2 40074 1727204617.24150: Calling groups_inventory to load vars for managed-node2 40074 1727204617.24153: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.24164: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.24167: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.24171: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.24344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 
40074 1727204617.24542: done with get_vars() 40074 1727204617.24549: variable 'ansible_search_path' from source: unknown 40074 1727204617.24558: we have included files to process 40074 1727204617.24559: generating all_blocks data 40074 1727204617.24561: done generating all_blocks data 40074 1727204617.24564: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 40074 1727204617.24565: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 40074 1727204617.24566: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 40074 1727204617.24902: in VariableManager get_vars() 40074 1727204617.24919: done with get_vars() 40074 1727204617.25465: done processing included file 40074 1727204617.25466: iterating over new_blocks loaded from include file 40074 1727204617.25467: in VariableManager get_vars() 40074 1727204617.25480: done with get_vars() 40074 1727204617.25482: filtering new block on tags 40074 1727204617.25510: done filtering new block on tags 40074 1727204617.25512: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 40074 1727204617.25516: extending task lists for all hosts with included blocks 40074 1727204617.26154: done extending task lists 40074 1727204617.26155: done processing included files 40074 1727204617.26156: results queue empty 40074 1727204617.26156: checking for any_errors_fatal 40074 1727204617.26158: done checking for any_errors_fatal 40074 1727204617.26159: checking for max_fail_percentage 40074 1727204617.26160: done checking for max_fail_percentage 40074 1727204617.26160: checking to see if all hosts have failed and the 
running result is not ok 40074 1727204617.26161: done checking to see if all hosts have failed 40074 1727204617.26162: getting the remaining hosts for this loop 40074 1727204617.26163: done getting the remaining hosts for this loop 40074 1727204617.26164: getting the next task for host managed-node2 40074 1727204617.26167: done getting next task for host managed-node2 40074 1727204617.26169: ^ task is: TASK: Ensure state in ["present", "absent"] 40074 1727204617.26171: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.26172: getting variables 40074 1727204617.26173: in VariableManager get_vars() 40074 1727204617.26183: Calling all_inventory to load vars for managed-node2 40074 1727204617.26185: Calling groups_inventory to load vars for managed-node2 40074 1727204617.26186: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.26193: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.26195: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.26197: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.26350: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.26541: done with get_vars() 40074 1727204617.26549: done getting variables 40074 1727204617.26580: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.035) 0:00:11.027 ***** 40074 1727204617.26604: entering _queue_task() for managed-node2/fail 40074 1727204617.26825: worker is 1 (out of 1 available) 40074 1727204617.26841: exiting _queue_task() for managed-node2/fail 40074 1727204617.26855: done queuing things up, now waiting for results queue to drain 40074 1727204617.26857: waiting for pending results... 
40074 1727204617.27027: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 40074 1727204617.27102: in run() - task 12b410aa-8751-9fd7-2501-0000000002fc 40074 1727204617.27115: variable 'ansible_search_path' from source: unknown 40074 1727204617.27118: variable 'ansible_search_path' from source: unknown 40074 1727204617.27151: calling self._execute() 40074 1727204617.27235: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.27239: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.27248: variable 'omit' from source: magic vars 40074 1727204617.27557: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.27570: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.27688: variable 'state' from source: include params 40074 1727204617.27694: Evaluated conditional (state not in ["present", "absent"]): False 40074 1727204617.27697: when evaluation is False, skipping this task 40074 1727204617.27703: _execute() done 40074 1727204617.27708: dumping result to json 40074 1727204617.27712: done dumping result, returning 40074 1727204617.27719: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [12b410aa-8751-9fd7-2501-0000000002fc] 40074 1727204617.27724: sending task result for task 12b410aa-8751-9fd7-2501-0000000002fc 40074 1727204617.27815: done sending task result for task 12b410aa-8751-9fd7-2501-0000000002fc 40074 1727204617.27818: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 40074 1727204617.27894: no more pending results, returning what we have 40074 1727204617.27897: results queue empty 40074 1727204617.27898: checking for any_errors_fatal 40074 1727204617.27900: done checking for any_errors_fatal 40074 1727204617.27901: 
checking for max_fail_percentage 40074 1727204617.27902: done checking for max_fail_percentage 40074 1727204617.27903: checking to see if all hosts have failed and the running result is not ok 40074 1727204617.27904: done checking to see if all hosts have failed 40074 1727204617.27905: getting the remaining hosts for this loop 40074 1727204617.27906: done getting the remaining hosts for this loop 40074 1727204617.27910: getting the next task for host managed-node2 40074 1727204617.27916: done getting next task for host managed-node2 40074 1727204617.27918: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 40074 1727204617.27921: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.27924: getting variables 40074 1727204617.27926: in VariableManager get_vars() 40074 1727204617.27966: Calling all_inventory to load vars for managed-node2 40074 1727204617.27969: Calling groups_inventory to load vars for managed-node2 40074 1727204617.27972: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.27980: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.27982: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.27985: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.28152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.28373: done with get_vars() 40074 1727204617.28380: done getting variables 40074 1727204617.28428: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.018) 0:00:11.046 ***** 40074 1727204617.28452: entering _queue_task() for managed-node2/fail 40074 1727204617.28647: worker is 1 (out of 1 available) 40074 1727204617.28663: exiting _queue_task() for managed-node2/fail 40074 1727204617.28675: done queuing things up, now waiting for results queue to drain 40074 1727204617.28677: waiting for pending results... 
40074 1727204617.28836: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 40074 1727204617.28898: in run() - task 12b410aa-8751-9fd7-2501-0000000002fd 40074 1727204617.28913: variable 'ansible_search_path' from source: unknown 40074 1727204617.28917: variable 'ansible_search_path' from source: unknown 40074 1727204617.28948: calling self._execute() 40074 1727204617.29039: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.29043: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.29047: variable 'omit' from source: magic vars 40074 1727204617.29343: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.29354: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.29474: variable 'type' from source: set_fact 40074 1727204617.29480: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 40074 1727204617.29483: when evaluation is False, skipping this task 40074 1727204617.29488: _execute() done 40074 1727204617.29495: dumping result to json 40074 1727204617.29498: done dumping result, returning 40074 1727204617.29506: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12b410aa-8751-9fd7-2501-0000000002fd] 40074 1727204617.29511: sending task result for task 12b410aa-8751-9fd7-2501-0000000002fd 40074 1727204617.29605: done sending task result for task 12b410aa-8751-9fd7-2501-0000000002fd 40074 1727204617.29609: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 40074 1727204617.29657: no more pending results, returning what we have 40074 1727204617.29660: results queue empty 40074 1727204617.29661: checking for any_errors_fatal 40074 1727204617.29667: done checking for any_errors_fatal 40074 1727204617.29668: 
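The two skipped tasks above are input-validation guards at lines 3 and 8 of `manage_test_interface.yml`. Their names and `false_condition` fields in the skip output let us reconstruct them fairly confidently; only the `msg` text is an assumption:

```yaml
# Reconstructed from the task names and false_condition values in the
# skip results; the fail messages are assumed, not taken from the file.
- name: Ensure state in ["present", "absent"]
  fail:
    msg: "state must be present or absent"   # assumed message
  when: state not in ["present", "absent"]

- name: Ensure type in ["dummy", "tap", "veth"]
  fail:
    msg: "type must be dummy, tap or veth"   # assumed message
  when: type not in ["dummy", "tap", "veth"]
```

Because both `when` expressions evaluate to False here (`state` and `type` hold allowed values), each task is skipped with `skip_reason: Conditional result was False` rather than failing the play.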
checking for max_fail_percentage 40074 1727204617.29670: done checking for max_fail_percentage 40074 1727204617.29671: checking to see if all hosts have failed and the running result is not ok 40074 1727204617.29672: done checking to see if all hosts have failed 40074 1727204617.29673: getting the remaining hosts for this loop 40074 1727204617.29674: done getting the remaining hosts for this loop 40074 1727204617.29678: getting the next task for host managed-node2 40074 1727204617.29683: done getting next task for host managed-node2 40074 1727204617.29685: ^ task is: TASK: Include the task 'show_interfaces.yml' 40074 1727204617.29688: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.29700: getting variables 40074 1727204617.29702: in VariableManager get_vars() 40074 1727204617.29735: Calling all_inventory to load vars for managed-node2 40074 1727204617.29737: Calling groups_inventory to load vars for managed-node2 40074 1727204617.29739: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.29746: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.29748: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.29750: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.29912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.30114: done with get_vars() 40074 1727204617.30122: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.017) 0:00:11.063 ***** 40074 1727204617.30196: entering _queue_task() for managed-node2/include_tasks 40074 1727204617.30394: worker is 1 (out of 1 available) 40074 1727204617.30410: exiting _queue_task() for managed-node2/include_tasks 40074 1727204617.30421: done queuing things up, now waiting for results queue to drain 40074 1727204617.30422: waiting for pending results... 
40074 1727204617.30573: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 40074 1727204617.30640: in run() - task 12b410aa-8751-9fd7-2501-0000000002fe 40074 1727204617.30650: variable 'ansible_search_path' from source: unknown 40074 1727204617.30663: variable 'ansible_search_path' from source: unknown 40074 1727204617.30693: calling self._execute() 40074 1727204617.30776: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.30784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.30795: variable 'omit' from source: magic vars 40074 1727204617.31108: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.31118: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.31125: _execute() done 40074 1727204617.31129: dumping result to json 40074 1727204617.31136: done dumping result, returning 40074 1727204617.31142: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-9fd7-2501-0000000002fe] 40074 1727204617.31148: sending task result for task 12b410aa-8751-9fd7-2501-0000000002fe 40074 1727204617.31240: done sending task result for task 12b410aa-8751-9fd7-2501-0000000002fe 40074 1727204617.31243: WORKER PROCESS EXITING 40074 1727204617.31274: no more pending results, returning what we have 40074 1727204617.31278: in VariableManager get_vars() 40074 1727204617.31321: Calling all_inventory to load vars for managed-node2 40074 1727204617.31324: Calling groups_inventory to load vars for managed-node2 40074 1727204617.31327: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.31337: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.31340: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.31344: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.31547: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.31740: done with get_vars() 40074 1727204617.31747: variable 'ansible_search_path' from source: unknown 40074 1727204617.31748: variable 'ansible_search_path' from source: unknown 40074 1727204617.31775: we have included files to process 40074 1727204617.31776: generating all_blocks data 40074 1727204617.31777: done generating all_blocks data 40074 1727204617.31780: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204617.31781: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204617.31783: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 40074 1727204617.31865: in VariableManager get_vars() 40074 1727204617.31882: done with get_vars() 40074 1727204617.31971: done processing included file 40074 1727204617.31973: iterating over new_blocks loaded from include file 40074 1727204617.31974: in VariableManager get_vars() 40074 1727204617.31988: done with get_vars() 40074 1727204617.31991: filtering new block on tags 40074 1727204617.32005: done filtering new block on tags 40074 1727204617.32006: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 40074 1727204617.32010: extending task lists for all hosts with included blocks 40074 1727204617.32322: done extending task lists 40074 1727204617.32323: done processing included files 40074 1727204617.32324: results queue empty 40074 1727204617.32324: checking for any_errors_fatal 40074 1727204617.32326: done checking for any_errors_fatal 40074 1727204617.32327: checking for 
max_fail_percentage 40074 1727204617.32327: done checking for max_fail_percentage 40074 1727204617.32328: checking to see if all hosts have failed and the running result is not ok 40074 1727204617.32329: done checking to see if all hosts have failed 40074 1727204617.32329: getting the remaining hosts for this loop 40074 1727204617.32330: done getting the remaining hosts for this loop 40074 1727204617.32333: getting the next task for host managed-node2 40074 1727204617.32336: done getting next task for host managed-node2 40074 1727204617.32337: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 40074 1727204617.32339: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.32341: getting variables 40074 1727204617.32342: in VariableManager get_vars() 40074 1727204617.32353: Calling all_inventory to load vars for managed-node2 40074 1727204617.32355: Calling groups_inventory to load vars for managed-node2 40074 1727204617.32358: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.32363: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.32365: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.32367: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.32524: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.32716: done with get_vars() 40074 1727204617.32723: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.025) 0:00:11.089 ***** 40074 1727204617.32779: entering _queue_task() for managed-node2/include_tasks 40074 1727204617.32985: worker is 1 (out of 1 available) 40074 1727204617.33000: exiting _queue_task() for managed-node2/include_tasks 40074 1727204617.33014: done queuing things up, now waiting for results queue to drain 40074 1727204617.33015: waiting for pending results... 
40074 1727204617.33182: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 40074 1727204617.33265: in run() - task 12b410aa-8751-9fd7-2501-000000000374 40074 1727204617.33279: variable 'ansible_search_path' from source: unknown 40074 1727204617.33283: variable 'ansible_search_path' from source: unknown 40074 1727204617.33314: calling self._execute() 40074 1727204617.33391: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.33399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.33408: variable 'omit' from source: magic vars 40074 1727204617.33714: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.33724: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.33731: _execute() done 40074 1727204617.33738: dumping result to json 40074 1727204617.33741: done dumping result, returning 40074 1727204617.33748: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-9fd7-2501-000000000374] 40074 1727204617.33754: sending task result for task 12b410aa-8751-9fd7-2501-000000000374 40074 1727204617.33842: done sending task result for task 12b410aa-8751-9fd7-2501-000000000374 40074 1727204617.33845: WORKER PROCESS EXITING 40074 1727204617.33876: no more pending results, returning what we have 40074 1727204617.33880: in VariableManager get_vars() 40074 1727204617.33923: Calling all_inventory to load vars for managed-node2 40074 1727204617.33927: Calling groups_inventory to load vars for managed-node2 40074 1727204617.33930: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.33940: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.33943: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.33946: Calling groups_plugins_play to load vars for managed-node2 40074 
1727204617.34125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.34323: done with get_vars() 40074 1727204617.34329: variable 'ansible_search_path' from source: unknown 40074 1727204617.34330: variable 'ansible_search_path' from source: unknown 40074 1727204617.34375: we have included files to process 40074 1727204617.34376: generating all_blocks data 40074 1727204617.34378: done generating all_blocks data 40074 1727204617.34379: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204617.34380: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204617.34381: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 40074 1727204617.34588: done processing included file 40074 1727204617.34592: iterating over new_blocks loaded from include file 40074 1727204617.34593: in VariableManager get_vars() 40074 1727204617.34609: done with get_vars() 40074 1727204617.34610: filtering new block on tags 40074 1727204617.34627: done filtering new block on tags 40074 1727204617.34629: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 40074 1727204617.34633: extending task lists for all hosts with included blocks 40074 1727204617.34750: done extending task lists 40074 1727204617.34751: done processing included files 40074 1727204617.34752: results queue empty 40074 1727204617.34752: checking for any_errors_fatal 40074 1727204617.34754: done checking for any_errors_fatal 40074 1727204617.34755: checking for max_fail_percentage 40074 1727204617.34756: done 
checking for max_fail_percentage 40074 1727204617.34756: checking to see if all hosts have failed and the running result is not ok 40074 1727204617.34757: done checking to see if all hosts have failed 40074 1727204617.34758: getting the remaining hosts for this loop 40074 1727204617.34758: done getting the remaining hosts for this loop 40074 1727204617.34760: getting the next task for host managed-node2 40074 1727204617.34765: done getting next task for host managed-node2 40074 1727204617.34766: ^ task is: TASK: Gather current interface info 40074 1727204617.34769: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.34771: getting variables 40074 1727204617.34772: in VariableManager get_vars() 40074 1727204617.34782: Calling all_inventory to load vars for managed-node2 40074 1727204617.34783: Calling groups_inventory to load vars for managed-node2 40074 1727204617.34785: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.34790: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.34792: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.34795: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.34951: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.35140: done with get_vars() 40074 1727204617.35148: done getting variables 40074 1727204617.35183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.024) 0:00:11.113 ***** 40074 1727204617.35207: entering _queue_task() for managed-node2/command 40074 1727204617.35401: worker is 1 (out of 1 available) 40074 1727204617.35416: exiting _queue_task() for managed-node2/command 40074 1727204617.35429: done queuing things up, now waiting for results queue to drain 40074 1727204617.35431: waiting for pending results... 
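The log confirms a `command` action at `get_current_interfaces.yml:3` named "Gather current interface info", but not the command line itself. A hedged sketch of what such a task might look like; the specific command and register name below are assumptions:

```yaml
# Sketch only: the task name and module ('command') come from the log;
# the command line and register variable are illustrative guesses.
- name: Gather current interface info
  command: ls /sys/class/net   # assumption: a common way to list interfaces
  register: current_interfaces
```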
40074 1727204617.35604: running TaskExecutor() for managed-node2/TASK: Gather current interface info 40074 1727204617.35685: in run() - task 12b410aa-8751-9fd7-2501-0000000003ab 40074 1727204617.35699: variable 'ansible_search_path' from source: unknown 40074 1727204617.35702: variable 'ansible_search_path' from source: unknown 40074 1727204617.35733: calling self._execute() 40074 1727204617.35809: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.35815: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.35824: variable 'omit' from source: magic vars 40074 1727204617.36126: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.36138: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.36145: variable 'omit' from source: magic vars 40074 1727204617.36191: variable 'omit' from source: magic vars 40074 1727204617.36220: variable 'omit' from source: magic vars 40074 1727204617.36255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204617.36290: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204617.36309: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204617.36325: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204617.36339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204617.36367: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204617.36373: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.36376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 
1727204617.36465: Set connection var ansible_pipelining to False 40074 1727204617.36472: Set connection var ansible_shell_executable to /bin/sh 40074 1727204617.36475: Set connection var ansible_shell_type to sh 40074 1727204617.36478: Set connection var ansible_connection to ssh 40074 1727204617.36486: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204617.36497: Set connection var ansible_timeout to 10 40074 1727204617.36519: variable 'ansible_shell_executable' from source: unknown 40074 1727204617.36523: variable 'ansible_connection' from source: unknown 40074 1727204617.36526: variable 'ansible_module_compression' from source: unknown 40074 1727204617.36528: variable 'ansible_shell_type' from source: unknown 40074 1727204617.36531: variable 'ansible_shell_executable' from source: unknown 40074 1727204617.36538: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.36543: variable 'ansible_pipelining' from source: unknown 40074 1727204617.36547: variable 'ansible_timeout' from source: unknown 40074 1727204617.36552: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.36669: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204617.36679: variable 'omit' from source: magic vars 40074 1727204617.36686: starting attempt loop 40074 1727204617.36691: running the handler 40074 1727204617.36706: _low_level_execute_command(): starting 40074 1727204617.36718: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204617.37268: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
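The `Set connection var` lines above record the effective connection settings for this task. The same values, expressed as inventory variables (these mirror what the log reports; they happen to match Ansible's defaults for an SSH connection):

```yaml
# Connection settings as reported by the "Set connection var" log lines.
ansible_connection: ssh
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false
ansible_timeout: 10
```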
/etc/ssh/ssh_config <<< 40074 1727204617.37272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.37275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204617.37277: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.37334: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204617.37337: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204617.37396: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204617.39154: stdout chunk (state=3): >>>/root <<< 40074 1727204617.39260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204617.39312: stderr chunk (state=3): >>><<< 40074 1727204617.39315: stdout chunk (state=3): >>><<< 40074 1727204617.39341: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204617.39353: _low_level_execute_command(): starting 40074 1727204617.39359: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355 `" && echo ansible-tmp-1727204617.3934073-40700-181774140970355="` echo /root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355 `" ) && sleep 0' 40074 1727204617.39821: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204617.39824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204617.39827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
<<< 40074 1727204617.39836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.39895: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204617.39898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204617.39935: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204617.42016: stdout chunk (state=3): >>>ansible-tmp-1727204617.3934073-40700-181774140970355=/root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355 <<< 40074 1727204617.42139: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204617.42185: stderr chunk (state=3): >>><<< 40074 1727204617.42188: stdout chunk (state=3): >>><<< 40074 1727204617.42209: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204617.3934073-40700-181774140970355=/root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204617.42237: variable 'ansible_module_compression' from source: unknown 40074 1727204617.42279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204617.42311: variable 'ansible_facts' from source: unknown 40074 1727204617.42373: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/AnsiballZ_command.py 40074 1727204617.42481: Sending initial data 40074 1727204617.42485: Sent initial data (156 bytes) 40074 1727204617.42959: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204617.42962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.42965: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204617.42971: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 <<< 40074 1727204617.42973: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.43023: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204617.43027: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204617.43071: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204617.44757: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204617.44796: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204617.44836: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpw53i6px6 /root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/AnsiballZ_command.py <<< 40074 1727204617.44844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/AnsiballZ_command.py" <<< 40074 1727204617.44874: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpw53i6px6" to remote "/root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/AnsiballZ_command.py" <<< 40074 1727204617.44877: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/AnsiballZ_command.py" <<< 40074 1727204617.45660: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204617.45738: stderr chunk (state=3): >>><<< 40074 1727204617.45742: stdout chunk (state=3): >>><<< 40074 1727204617.45762: done transferring module to remote 40074 1727204617.45774: _low_level_execute_command(): starting 40074 1727204617.45780: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/ /root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/AnsiballZ_command.py && sleep 0' 40074 1727204617.46279: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204617.46283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204617.46285: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204617.46287: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204617.46292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.46343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204617.46347: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204617.46394: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204617.48336: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204617.48391: stderr chunk (state=3): >>><<< 40074 1727204617.48396: stdout chunk (state=3): >>><<< 40074 1727204617.48413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204617.48416: _low_level_execute_command(): starting 40074 1727204617.48425: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/AnsiballZ_command.py && sleep 0' 40074 1727204617.48920: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204617.48924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.48928: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204617.48933: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.48987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204617.48994: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204617.49049: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204617.67003: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:37.665629", "end": "2024-09-24 15:03:37.669043", "delta": "0:00:00.003414", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204617.68673: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204617.68736: stderr chunk (state=3): >>><<< 40074 1727204617.68740: stdout chunk (state=3): >>><<< 40074 1727204617.68764: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nethtest0\nlo\npeerethtest0\nrpltstbr", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 15:03:37.665629", "end": "2024-09-24 15:03:37.669043", "delta": "0:00:00.003414", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204617.68806: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204617.68815: _low_level_execute_command(): starting 40074 1727204617.68821: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204617.3934073-40700-181774140970355/ > /dev/null 2>&1 && sleep 0' 40074 1727204617.69303: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204617.69307: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204617.69310: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204617.69312: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204617.69341: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204617.69377: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204617.69383: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204617.69385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204617.69423: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204617.71376: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204617.71422: stderr chunk (state=3): >>><<< 40074 1727204617.71426: stdout chunk (state=3): >>><<< 40074 1727204617.71441: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204617.71449: handler run complete 40074 1727204617.71472: Evaluated conditional (False): False 40074 1727204617.71486: attempt loop complete, returning result 40074 1727204617.71491: _execute() done 40074 1727204617.71494: dumping result to json 40074 1727204617.71504: done dumping result, returning 40074 1727204617.71512: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-9fd7-2501-0000000003ab] 40074 1727204617.71517: sending task result for task 12b410aa-8751-9fd7-2501-0000000003ab 40074 1727204617.71630: done sending task result for task 12b410aa-8751-9fd7-2501-0000000003ab 40074 1727204617.71635: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003414", "end": "2024-09-24 15:03:37.669043", "rc": 0, "start": "2024-09-24 15:03:37.665629" } STDOUT: bonding_masters eth0 ethtest0 lo peerethtest0 rpltstbr 40074 1727204617.71726: no more pending results, returning what we have 40074 1727204617.71729: results queue empty 40074 1727204617.71733: checking for 
any_errors_fatal 40074 1727204617.71734: done checking for any_errors_fatal 40074 1727204617.71735: checking for max_fail_percentage 40074 1727204617.71737: done checking for max_fail_percentage 40074 1727204617.71738: checking to see if all hosts have failed and the running result is not ok 40074 1727204617.71739: done checking to see if all hosts have failed 40074 1727204617.71740: getting the remaining hosts for this loop 40074 1727204617.71741: done getting the remaining hosts for this loop 40074 1727204617.71746: getting the next task for host managed-node2 40074 1727204617.71753: done getting next task for host managed-node2 40074 1727204617.71757: ^ task is: TASK: Set current_interfaces 40074 1727204617.71762: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204617.71767: getting variables 40074 1727204617.71768: in VariableManager get_vars() 40074 1727204617.71814: Calling all_inventory to load vars for managed-node2 40074 1727204617.71818: Calling groups_inventory to load vars for managed-node2 40074 1727204617.71820: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204617.71834: Calling all_plugins_play to load vars for managed-node2 40074 1727204617.71837: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204617.71841: Calling groups_plugins_play to load vars for managed-node2 40074 1727204617.72043: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204617.72250: done with get_vars() 40074 1727204617.72259: done getting variables 40074 1727204617.72310: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.371) 0:00:11.485 ***** 40074 1727204617.72339: entering _queue_task() for managed-node2/set_fact 40074 1727204617.72554: worker is 1 (out of 1 available) 40074 1727204617.72568: exiting _queue_task() for managed-node2/set_fact 40074 1727204617.72580: done queuing things up, now waiting for results queue to drain 40074 1727204617.72582: waiting for pending results... 
40074 1727204617.72756: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 40074 1727204617.72844: in run() - task 12b410aa-8751-9fd7-2501-0000000003ac 40074 1727204617.72856: variable 'ansible_search_path' from source: unknown 40074 1727204617.72860: variable 'ansible_search_path' from source: unknown 40074 1727204617.72895: calling self._execute() 40074 1727204617.72972: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.72978: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.72988: variable 'omit' from source: magic vars 40074 1727204617.73301: variable 'ansible_distribution_major_version' from source: facts 40074 1727204617.73312: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204617.73319: variable 'omit' from source: magic vars 40074 1727204617.73371: variable 'omit' from source: magic vars 40074 1727204617.73458: variable '_current_interfaces' from source: set_fact 40074 1727204617.73525: variable 'omit' from source: magic vars 40074 1727204617.73561: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204617.73596: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204617.73613: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204617.73630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204617.73642: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204617.73670: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204617.73675: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.73678: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.73766: Set connection var ansible_pipelining to False 40074 1727204617.73772: Set connection var ansible_shell_executable to /bin/sh 40074 1727204617.73775: Set connection var ansible_shell_type to sh 40074 1727204617.73779: Set connection var ansible_connection to ssh 40074 1727204617.73787: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204617.73795: Set connection var ansible_timeout to 10 40074 1727204617.73821: variable 'ansible_shell_executable' from source: unknown 40074 1727204617.73824: variable 'ansible_connection' from source: unknown 40074 1727204617.73827: variable 'ansible_module_compression' from source: unknown 40074 1727204617.73829: variable 'ansible_shell_type' from source: unknown 40074 1727204617.73835: variable 'ansible_shell_executable' from source: unknown 40074 1727204617.73837: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204617.73843: variable 'ansible_pipelining' from source: unknown 40074 1727204617.73846: variable 'ansible_timeout' from source: unknown 40074 1727204617.73851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204617.73969: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204617.73979: variable 'omit' from source: magic vars 40074 1727204617.73986: starting attempt loop 40074 1727204617.73991: running the handler 40074 1727204617.74003: handler run complete 40074 1727204617.74018: attempt loop complete, returning result 40074 1727204617.74022: _execute() done 40074 1727204617.74025: dumping result to json 40074 1727204617.74028: done dumping result, returning 40074 
1727204617.74034: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-9fd7-2501-0000000003ac]
40074 1727204617.74036: sending task result for task 12b410aa-8751-9fd7-2501-0000000003ac
40074 1727204617.74124: done sending task result for task 12b410aa-8751-9fd7-2501-0000000003ac
40074 1727204617.74127: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "ansible_facts": {
        "current_interfaces": [
            "bonding_masters",
            "eth0",
            "ethtest0",
            "lo",
            "peerethtest0",
            "rpltstbr"
        ]
    },
    "changed": false
}
40074 1727204617.74197: no more pending results, returning what we have
40074 1727204617.74201: results queue empty
40074 1727204617.74202: checking for any_errors_fatal
40074 1727204617.74208: done checking for any_errors_fatal
40074 1727204617.74209: checking for max_fail_percentage
40074 1727204617.74211: done checking for max_fail_percentage
40074 1727204617.74212: checking to see if all hosts have failed and the running result is not ok
40074 1727204617.74213: done checking to see if all hosts have failed
40074 1727204617.74214: getting the remaining hosts for this loop
40074 1727204617.74215: done getting the remaining hosts for this loop
40074 1727204617.74219: getting the next task for host managed-node2
40074 1727204617.74227: done getting next task for host managed-node2
40074 1727204617.74230: ^ task is: TASK: Show current_interfaces
40074 1727204617.74236: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204617.74241: getting variables
40074 1727204617.74243: in VariableManager get_vars()
40074 1727204617.74277: Calling all_inventory to load vars for managed-node2
40074 1727204617.74281: Calling groups_inventory to load vars for managed-node2
40074 1727204617.74283: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204617.74293: Calling all_plugins_play to load vars for managed-node2
40074 1727204617.74296: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204617.74298: Calling groups_plugins_play to load vars for managed-node2
40074 1727204617.74459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204617.74659: done with get_vars()
40074 1727204617.74667: done getting variables
40074 1727204617.74716: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Show current_interfaces] *************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5
Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.024) 0:00:11.509 *****
40074 1727204617.74744: entering _queue_task() for managed-node2/debug
40074 1727204617.74948: worker is 1 (out of 1 available)
40074 1727204617.74963: exiting _queue_task() for managed-node2/debug
40074 1727204617.74975: done queuing things up, now waiting for results queue to drain
40074 1727204617.74977: waiting for pending results...
40074 1727204617.75139: running TaskExecutor() for managed-node2/TASK: Show current_interfaces
40074 1727204617.75214: in run() - task 12b410aa-8751-9fd7-2501-000000000375
40074 1727204617.75227: variable 'ansible_search_path' from source: unknown
40074 1727204617.75234: variable 'ansible_search_path' from source: unknown
40074 1727204617.75262: calling self._execute()
40074 1727204617.75337: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204617.75342: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204617.75350: variable 'omit' from source: magic vars
40074 1727204617.75646: variable 'ansible_distribution_major_version' from source: facts
40074 1727204617.75658: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204617.75668: variable 'omit' from source: magic vars
40074 1727204617.75712: variable 'omit' from source: magic vars
40074 1727204617.75799: variable 'current_interfaces' from source: set_fact
40074 1727204617.75824: variable 'omit' from source: magic vars
40074 1727204617.75858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204617.75898: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204617.75915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204617.75934: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204617.75943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204617.75972: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204617.75976: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204617.75986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204617.76066: Set connection var ansible_pipelining to False
40074 1727204617.76073: Set connection var ansible_shell_executable to /bin/sh
40074 1727204617.76077: Set connection var ansible_shell_type to sh
40074 1727204617.76079: Set connection var ansible_connection to ssh
40074 1727204617.76088: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204617.76096: Set connection var ansible_timeout to 10
40074 1727204617.76121: variable 'ansible_shell_executable' from source: unknown
40074 1727204617.76125: variable 'ansible_connection' from source: unknown
40074 1727204617.76128: variable 'ansible_module_compression' from source: unknown
40074 1727204617.76130: variable 'ansible_shell_type' from source: unknown
40074 1727204617.76137: variable 'ansible_shell_executable' from source: unknown
40074 1727204617.76139: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204617.76146: variable 'ansible_pipelining' from source: unknown
40074 1727204617.76148: variable 'ansible_timeout' from source: unknown
40074 1727204617.76154: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204617.76272: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204617.76282: variable 'omit' from source: magic vars
40074 1727204617.76288: starting attempt loop
40074 1727204617.76293: running the handler
40074 1727204617.76340: handler run complete
40074 1727204617.76352: attempt loop complete, returning result
40074 1727204617.76355: _execute() done
40074 1727204617.76358: dumping result to json
40074 1727204617.76364: done dumping result, returning
40074 1727204617.76371: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-9fd7-2501-000000000375]
40074 1727204617.76376: sending task result for task 12b410aa-8751-9fd7-2501-000000000375
40074 1727204617.76463: done sending task result for task 12b410aa-8751-9fd7-2501-000000000375
40074 1727204617.76466: WORKER PROCESS EXITING
ok: [managed-node2] => {}

MSG:

current_interfaces: ['bonding_masters', 'eth0', 'ethtest0', 'lo', 'peerethtest0', 'rpltstbr']

40074 1727204617.76517: no more pending results, returning what we have
40074 1727204617.76520: results queue empty
40074 1727204617.76521: checking for any_errors_fatal
40074 1727204617.76526: done checking for any_errors_fatal
40074 1727204617.76527: checking for max_fail_percentage
40074 1727204617.76529: done checking for max_fail_percentage
40074 1727204617.76529: checking to see if all hosts have failed and the running result is not ok
40074 1727204617.76531: done checking to see if all hosts have failed
40074 1727204617.76531: getting the remaining hosts for this loop
40074 1727204617.76533: done getting the remaining hosts for this loop
40074 1727204617.76537: getting the next task for host managed-node2
40074 1727204617.76544: done getting next task for host managed-node2
40074 1727204617.76547: ^ task is: TASK: Install iproute
40074 1727204617.76550: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204617.76554: getting variables
40074 1727204617.76555: in VariableManager get_vars()
40074 1727204617.76598: Calling all_inventory to load vars for managed-node2
40074 1727204617.76601: Calling groups_inventory to load vars for managed-node2
40074 1727204617.76604: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204617.76613: Calling all_plugins_play to load vars for managed-node2
40074 1727204617.76615: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204617.76618: Calling groups_plugins_play to load vars for managed-node2
40074 1727204617.76816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204617.77007: done with get_vars()
40074 1727204617.77015: done getting variables
40074 1727204617.77061: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Install iproute] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Tuesday 24 September 2024 15:03:37 -0400 (0:00:00.023) 0:00:11.532 *****
40074 1727204617.77083: entering _queue_task() for managed-node2/package
40074 1727204617.77274: worker is 1 (out of 1 available)
40074 1727204617.77288: exiting _queue_task() for managed-node2/package
40074 1727204617.77304: done queuing things up, now waiting for results queue to drain
40074 1727204617.77306: waiting for pending results...
40074 1727204617.77466: running TaskExecutor() for managed-node2/TASK: Install iproute
40074 1727204617.77541: in run() - task 12b410aa-8751-9fd7-2501-0000000002ff
40074 1727204617.77555: variable 'ansible_search_path' from source: unknown
40074 1727204617.77560: variable 'ansible_search_path' from source: unknown
40074 1727204617.77591: calling self._execute()
40074 1727204617.77665: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204617.77672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204617.77681: variable 'omit' from source: magic vars
40074 1727204617.77977: variable 'ansible_distribution_major_version' from source: facts
40074 1727204617.77988: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204617.77996: variable 'omit' from source: magic vars
40074 1727204617.78025: variable 'omit' from source: magic vars
40074 1727204617.78194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204617.79858: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204617.80166: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204617.80200: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204617.80228: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204617.80254: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204617.80339: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204617.80363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204617.80391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204617.80424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204617.80439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204617.80528: variable '__network_is_ostree' from source: set_fact
40074 1727204617.80532: variable 'omit' from source: magic vars
40074 1727204617.80560: variable 'omit' from source: magic vars
40074 1727204617.80585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204617.80613: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204617.80628: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204617.80647: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204617.80656: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204617.80684: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204617.80687: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204617.80696: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204617.80781: Set connection var ansible_pipelining to False
40074 1727204617.80786: Set connection var ansible_shell_executable to /bin/sh
40074 1727204617.80792: Set connection var ansible_shell_type to sh
40074 1727204617.80795: Set connection var ansible_connection to ssh
40074 1727204617.80806: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204617.80809: Set connection var ansible_timeout to 10
40074 1727204617.80836: variable 'ansible_shell_executable' from source: unknown
40074 1727204617.80839: variable 'ansible_connection' from source: unknown
40074 1727204617.80843: variable 'ansible_module_compression' from source: unknown
40074 1727204617.80846: variable 'ansible_shell_type' from source: unknown
40074 1727204617.80851: variable 'ansible_shell_executable' from source: unknown
40074 1727204617.80854: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204617.80860: variable 'ansible_pipelining' from source: unknown
40074 1727204617.80863: variable 'ansible_timeout' from source: unknown
40074 1727204617.80868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204617.80956: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204617.80965: variable 'omit' from source: magic vars
40074 1727204617.80972: starting attempt loop
40074 1727204617.80975: running the handler
40074 1727204617.80982: variable 'ansible_facts' from source: unknown
40074 1727204617.80984: variable 'ansible_facts' from source: unknown
40074 1727204617.81018: _low_level_execute_command(): starting
40074 1727204617.81026: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
40074 1727204617.81560: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204617.81565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.81570: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<<
40074 1727204617.81572: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.81627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204617.81636: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204617.81683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204617.83452: stdout chunk (state=3): >>>/root <<<
40074 1727204617.83569: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204617.83622: stderr chunk (state=3): >>><<<
40074 1727204617.83626: stdout chunk (state=3): >>><<<
40074 1727204617.83647: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204617.83658: _low_level_execute_command(): starting
40074 1727204617.83667: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828 `" && echo ansible-tmp-1727204617.8364668-40710-163275333475828="` echo /root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828 `" ) && sleep 0'
40074 1727204617.84128: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204617.84131: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.84134: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204617.84137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.84193: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204617.84197: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204617.84242: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204617.86273: stdout chunk (state=3): >>>ansible-tmp-1727204617.8364668-40710-163275333475828=/root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828 <<<
40074 1727204617.86398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204617.86445: stderr chunk (state=3): >>><<<
40074 1727204617.86449: stdout chunk (state=3): >>><<<
40074 1727204617.86463: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204617.8364668-40710-163275333475828=/root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204617.86490: variable 'ansible_module_compression' from source: unknown
40074 1727204617.86543: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED
40074 1727204617.86579: variable 'ansible_facts' from source: unknown
40074 1727204617.86658: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/AnsiballZ_dnf.py
40074 1727204617.86767: Sending initial data
40074 1727204617.86770: Sent initial data (152 bytes)
40074 1727204617.87221: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204617.87224: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
40074 1727204617.87227: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<<
40074 1727204617.87229: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<<
40074 1727204617.87234: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.87287: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204617.87296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204617.87334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204617.88983: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<<
40074 1727204617.89022: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<<
40074 1727204617.89055: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpv7nb1q_a /root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/AnsiballZ_dnf.py <<<
40074 1727204617.89060: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/AnsiballZ_dnf.py" <<<
40074 1727204617.89092: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpv7nb1q_a" to remote "/root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/AnsiballZ_dnf.py" <<<
40074 1727204617.90108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204617.90176: stderr chunk (state=3): >>><<<
40074 1727204617.90180: stdout chunk (state=3): >>><<<
40074 1727204617.90203: done transferring module to remote
40074 1727204617.90215: _low_level_execute_command(): starting
40074 1727204617.90225: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/ /root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/AnsiballZ_dnf.py && sleep 0'
40074 1727204617.90693: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204617.90700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.90703: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204617.90706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.90764: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204617.90767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204617.90802: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204617.92766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204617.92818: stderr chunk (state=3): >>><<<
40074 1727204617.92822: stdout chunk (state=3): >>><<<
40074 1727204617.92839: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204617.92842: _low_level_execute_command(): starting
40074 1727204617.92849: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/AnsiballZ_dnf.py && sleep 0'
40074 1727204617.93283: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<<
40074 1727204617.93314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.93319: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204617.93376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204617.93381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204617.93428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204619.47572: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<<
40074 1727204619.53481: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<<
40074 1727204619.53486: stdout chunk (state=3): >>><<<
40074 1727204619.53488: stderr chunk (state=3): >>><<<
40074 1727204619.53511: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed.
40074 1727204619.53673: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204619.53682: _low_level_execute_command(): starting 40074 1727204619.53685: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204617.8364668-40710-163275333475828/ > /dev/null 2>&1 && sleep 0' 40074 1727204619.54254: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204619.54270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204619.54283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204619.54308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204619.54325: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204619.54428: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204619.54452: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204619.54530: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204619.56630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204619.56642: stdout chunk (state=3): >>><<< 40074 1727204619.56655: stderr chunk (state=3): >>><<< 40074 1727204619.56686: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 40074 1727204619.56703: handler run complete 40074 1727204619.56936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204619.57179: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204619.57237: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204619.57284: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204619.57323: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204619.57414: variable '__install_status' from source: set_fact 40074 1727204619.57442: Evaluated conditional (__install_status is success): True 40074 1727204619.57696: attempt loop complete, returning result 40074 1727204619.57700: _execute() done 40074 1727204619.57703: dumping result to json 40074 1727204619.57705: done dumping result, returning 40074 1727204619.57707: done running TaskExecutor() for managed-node2/TASK: Install iproute [12b410aa-8751-9fd7-2501-0000000002ff] 40074 1727204619.57709: sending task result for task 12b410aa-8751-9fd7-2501-0000000002ff 40074 1727204619.57798: done sending task result for task 12b410aa-8751-9fd7-2501-0000000002ff 40074 1727204619.57802: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 40074 1727204619.57926: no more pending results, returning what we have 40074 1727204619.57931: results queue empty 40074 1727204619.57932: checking for any_errors_fatal 40074 1727204619.57938: done checking for any_errors_fatal 40074 1727204619.57939: checking for max_fail_percentage 40074 1727204619.57941: done checking for max_fail_percentage 40074 1727204619.57942: checking to see if all hosts have failed and the running result is not ok 40074 1727204619.57943: done checking to 
see if all hosts have failed 40074 1727204619.57944: getting the remaining hosts for this loop 40074 1727204619.57946: done getting the remaining hosts for this loop 40074 1727204619.57952: getting the next task for host managed-node2 40074 1727204619.57959: done getting next task for host managed-node2 40074 1727204619.57964: ^ task is: TASK: Create veth interface {{ interface }} 40074 1727204619.57967: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204619.57972: getting variables 40074 1727204619.57974: in VariableManager get_vars() 40074 1727204619.58190: Calling all_inventory to load vars for managed-node2 40074 1727204619.58196: Calling groups_inventory to load vars for managed-node2 40074 1727204619.58199: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204619.58212: Calling all_plugins_play to load vars for managed-node2 40074 1727204619.58215: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204619.58219: Calling groups_plugins_play to load vars for managed-node2 40074 1727204619.58631: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204619.59005: done with get_vars() 40074 1727204619.59020: done getting variables 40074 1727204619.59100: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204619.59254: variable 'interface' from source: set_fact TASK [Create veth interface ethtest1] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 15:03:39 -0400 (0:00:01.822) 0:00:13.354 ***** 40074 1727204619.59293: entering _queue_task() for managed-node2/command 40074 1727204619.59651: worker is 1 (out of 1 available) 40074 1727204619.59664: exiting _queue_task() for managed-node2/command 40074 1727204619.59680: done queuing things up, now waiting for results queue to drain 40074 1727204619.59681: waiting for pending results... 40074 1727204619.60033: running TaskExecutor() for managed-node2/TASK: Create veth interface ethtest1 40074 1727204619.60096: in run() - task 12b410aa-8751-9fd7-2501-000000000300 40074 1727204619.60130: variable 'ansible_search_path' from source: unknown 40074 1727204619.60141: variable 'ansible_search_path' from source: unknown 40074 1727204619.60567: variable 'interface' from source: set_fact 40074 1727204619.60592: variable 'interface' from source: set_fact 40074 1727204619.60699: variable 'interface' from source: set_fact 40074 1727204619.60999: Loaded config def from plugin (lookup/items) 40074 1727204619.61019: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 40074 1727204619.61095: variable 'omit' from source: magic vars 40074 1727204619.61184: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204619.61207: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204619.61238: variable 'omit' from source: magic vars 40074 1727204619.61520: variable 
'ansible_distribution_major_version' from source: facts 40074 1727204619.61534: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204619.61813: variable 'type' from source: set_fact 40074 1727204619.61824: variable 'state' from source: include params 40074 1727204619.61868: variable 'interface' from source: set_fact 40074 1727204619.61871: variable 'current_interfaces' from source: set_fact 40074 1727204619.61878: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 40074 1727204619.61880: variable 'omit' from source: magic vars 40074 1727204619.61920: variable 'omit' from source: magic vars 40074 1727204619.61988: variable 'item' from source: unknown 40074 1727204619.62074: variable 'item' from source: unknown 40074 1727204619.62193: variable 'omit' from source: magic vars 40074 1727204619.62199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204619.62202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204619.62205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204619.62232: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204619.62252: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204619.62294: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204619.62309: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204619.62416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204619.62468: Set connection var ansible_pipelining to False 40074 1727204619.62482: Set connection var ansible_shell_executable to /bin/sh 40074 
1727204619.62493: Set connection var ansible_shell_type to sh 40074 1727204619.62501: Set connection var ansible_connection to ssh 40074 1727204619.62515: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204619.62537: Set connection var ansible_timeout to 10 40074 1727204619.62571: variable 'ansible_shell_executable' from source: unknown 40074 1727204619.62581: variable 'ansible_connection' from source: unknown 40074 1727204619.62592: variable 'ansible_module_compression' from source: unknown 40074 1727204619.62601: variable 'ansible_shell_type' from source: unknown 40074 1727204619.62609: variable 'ansible_shell_executable' from source: unknown 40074 1727204619.62617: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204619.62632: variable 'ansible_pipelining' from source: unknown 40074 1727204619.62646: variable 'ansible_timeout' from source: unknown 40074 1727204619.62657: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204619.62828: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204619.62851: variable 'omit' from source: magic vars 40074 1727204619.62869: starting attempt loop 40074 1727204619.62958: running the handler 40074 1727204619.62962: _low_level_execute_command(): starting 40074 1727204619.62966: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204619.63697: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204619.63715: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204619.63734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204619.63847: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204619.63874: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204619.63896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204619.63920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204619.64081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204619.65912: stdout chunk (state=3): >>>/root <<< 40074 1727204619.66087: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204619.66123: stdout chunk (state=3): >>><<< 40074 1727204619.66126: stderr chunk (state=3): >>><<< 40074 1727204619.66146: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204619.66165: _low_level_execute_command(): starting 40074 1727204619.66187: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339 `" && echo ansible-tmp-1727204619.661526-40745-156502119076339="` echo /root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339 `" ) && sleep 0' 40074 1727204619.66815: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204619.66835: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204619.66854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204619.66871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204619.66887: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204619.66959: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204619.67017: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204619.67042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204619.67118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204619.69242: stdout chunk (state=3): >>>ansible-tmp-1727204619.661526-40745-156502119076339=/root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339 <<< 40074 1727204619.69366: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204619.69429: stderr chunk (state=3): >>><<< 40074 1727204619.69440: stdout chunk (state=3): >>><<< 40074 1727204619.69595: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204619.661526-40745-156502119076339=/root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204619.69598: variable 'ansible_module_compression' from source: unknown 40074 1727204619.69601: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204619.69604: variable 'ansible_facts' from source: unknown 40074 1727204619.69695: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/AnsiballZ_command.py 40074 1727204619.69843: Sending initial data 40074 1727204619.69946: Sent initial data (155 bytes) 40074 1727204619.70506: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204619.70523: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204619.70541: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204619.70563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204619.70604: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204619.70622: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204619.70709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204619.70725: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204619.70741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204619.70773: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204619.70845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204619.72579: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204619.72644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204619.72698: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpsq3bq8bc /root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/AnsiballZ_command.py <<< 40074 1727204619.72709: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/AnsiballZ_command.py" <<< 40074 1727204619.72750: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpsq3bq8bc" to remote "/root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/AnsiballZ_command.py" <<< 40074 1727204619.73810: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204619.74057: stderr chunk (state=3): >>><<< 40074 1727204619.74060: stdout chunk (state=3): >>><<< 40074 1727204619.74063: done transferring module to remote 40074 1727204619.74065: _low_level_execute_command(): starting 40074 1727204619.74068: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/ /root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/AnsiballZ_command.py && sleep 0' 40074 1727204619.74943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204619.74946: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204619.74949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204619.74952: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204619.75009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204619.75047: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204619.75078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204619.77195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204619.77199: stdout chunk (state=3): >>><<< 40074 1727204619.77201: stderr chunk (state=3): >>><<< 40074 1727204619.77219: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204619.77227: _low_level_execute_command(): starting 40074 1727204619.77241: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/AnsiballZ_command.py && sleep 0' 40074 1727204619.77999: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204619.78029: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204619.78057: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204619.78072: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204619.78154: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204619.97040: stdout chunk 
(state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-24 15:03:39.961464", "end": "2024-09-24 15:03:39.967052", "delta": "0:00:00.005588", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204620.00614: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204620.00673: stderr chunk (state=3): >>><<< 40074 1727204620.00794: stdout chunk (state=3): >>><<< 40074 1727204620.00798: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1"], "start": "2024-09-24 15:03:39.961464", "end": "2024-09-24 15:03:39.967052", "delta": "0:00:00.005588", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add ethtest1 type veth peer name peerethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204620.00802: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add ethtest1 type veth peer name peerethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204620.00805: _low_level_execute_command(): starting 40074 1727204620.00811: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204619.661526-40745-156502119076339/ > /dev/null 2>&1 && sleep 0' 40074 1727204620.01478: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.01491: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.01504: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.01522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204620.01537: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204620.01554: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204620.01568: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.01584: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204620.01672: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.01696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.01710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.01730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.01808: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.07196: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.07201: stdout chunk (state=3): >>><<< 40074 1727204620.07203: stderr chunk (state=3): >>><<< 40074 1727204620.07206: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.07208: handler run complete 40074 1727204620.07210: Evaluated conditional (False): False 40074 1727204620.07212: attempt loop complete, returning result 40074 1727204620.07214: variable 'item' from source: unknown 40074 1727204620.07217: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link add ethtest1 type veth peer name peerethtest1) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "ethtest1", "type", "veth", "peer", "name", "peerethtest1" ], "delta": "0:00:00.005588", "end": "2024-09-24 15:03:39.967052", "item": "ip link add ethtest1 type veth peer name peerethtest1", "rc": 0, "start": "2024-09-24 15:03:39.961464" } 40074 1727204620.07387: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.07393: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.07397: variable 'omit' from source: magic vars 40074 1727204620.07560: variable 'ansible_distribution_major_version' 
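The `cmd` array in the result above is the argv form of `_raw_params`: because `_uses_shell` is false, the command module tokenizes the raw string itself instead of handing it to `/bin/sh`. As an illustrative sketch (not part of the log), that split behaves like Python's `shlex.split`:

```python
import shlex

# Raw params string as reported in the task result above.
raw_params = "ip link add ethtest1 type veth peer name peerethtest1"

# With _uses_shell=false the command module runs an argv list rather than
# a shell command line; shlex-style whitespace splitting reproduces the
# "cmd" array shown in the module result.
argv = shlex.split(raw_params)
print(argv)
# → ['ip', 'link', 'add', 'ethtest1', 'type', 'veth', 'peer', 'name', 'peerethtest1']
```

This is why quoting inside `_raw_params` matters: a quoted argument would survive as a single argv element, while shell operators like `|` or `&&` would be passed to `ip` literally instead of being interpreted.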
from source: facts 40074 1727204620.07564: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204620.07841: variable 'type' from source: set_fact 40074 1727204620.07853: variable 'state' from source: include params 40074 1727204620.07862: variable 'interface' from source: set_fact 40074 1727204620.07871: variable 'current_interfaces' from source: set_fact 40074 1727204620.07894: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 40074 1727204620.07897: variable 'omit' from source: magic vars 40074 1727204620.07916: variable 'omit' from source: magic vars 40074 1727204620.07994: variable 'item' from source: unknown 40074 1727204620.08069: variable 'item' from source: unknown 40074 1727204620.08168: variable 'omit' from source: magic vars 40074 1727204620.08173: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204620.08176: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204620.08178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204620.08181: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204620.08183: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.08185: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.08384: Set connection var ansible_pipelining to False 40074 1727204620.08387: Set connection var ansible_shell_executable to /bin/sh 40074 1727204620.08389: Set connection var ansible_shell_type to sh 40074 1727204620.08393: Set connection var ansible_connection to ssh 40074 1727204620.08396: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 
1727204620.08398: Set connection var ansible_timeout to 10 40074 1727204620.08404: variable 'ansible_shell_executable' from source: unknown 40074 1727204620.08410: variable 'ansible_connection' from source: unknown 40074 1727204620.08412: variable 'ansible_module_compression' from source: unknown 40074 1727204620.08415: variable 'ansible_shell_type' from source: unknown 40074 1727204620.08417: variable 'ansible_shell_executable' from source: unknown 40074 1727204620.08420: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.08422: variable 'ansible_pipelining' from source: unknown 40074 1727204620.08424: variable 'ansible_timeout' from source: unknown 40074 1727204620.08519: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.08544: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204620.08558: variable 'omit' from source: magic vars 40074 1727204620.08568: starting attempt loop 40074 1727204620.08574: running the handler 40074 1727204620.08585: _low_level_execute_command(): starting 40074 1727204620.08594: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204620.09317: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.09335: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.09414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.09478: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.09512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.09595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.11391: stdout chunk (state=3): >>>/root <<< 40074 1727204620.11574: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.11584: stdout chunk (state=3): >>><<< 40074 1727204620.11597: stderr chunk (state=3): >>><<< 40074 1727204620.11616: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.11712: _low_level_execute_command(): starting 40074 1727204620.11715: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892 `" && echo ansible-tmp-1727204620.1162207-40745-135765309169892="` echo /root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892 `" ) && sleep 0' 40074 1727204620.12273: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.12292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.12350: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.12424: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 40074 1727204620.12453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.12484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.12562: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.14629: stdout chunk (state=3): >>>ansible-tmp-1727204620.1162207-40745-135765309169892=/root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892 <<< 40074 1727204620.14843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.14847: stdout chunk (state=3): >>><<< 40074 1727204620.14850: stderr chunk (state=3): >>><<< 40074 1727204620.14994: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204620.1162207-40745-135765309169892=/root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 
2 debug2: Received exit status from master 0 40074 1727204620.14998: variable 'ansible_module_compression' from source: unknown 40074 1727204620.15000: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204620.15002: variable 'ansible_facts' from source: unknown 40074 1727204620.15068: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/AnsiballZ_command.py 40074 1727204620.15211: Sending initial data 40074 1727204620.15236: Sent initial data (156 bytes) 40074 1727204620.15876: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.16005: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204620.16026: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.16041: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.16122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.17845: stderr 
chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204620.17909: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204620.17956: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/AnsiballZ_command.py" <<< 40074 1727204620.18060: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpr4aaqqpu /root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/AnsiballZ_command.py <<< 40074 1727204620.18064: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpr4aaqqpu" to remote "/root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/AnsiballZ_command.py" <<< 40074 1727204620.19104: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.19194: stderr chunk (state=3): >>><<< 40074 1727204620.19206: stdout chunk (state=3): >>><<< 40074 1727204620.19235: done transferring module to remote 40074 1727204620.19251: 
_low_level_execute_command(): starting 40074 1727204620.19262: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/ /root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/AnsiballZ_command.py && sleep 0' 40074 1727204620.19904: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.19943: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.19958: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.19995: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204620.20007: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.20092: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.20110: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.20130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.20205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.22255: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 
1727204620.22277: stderr chunk (state=3): >>><<< 40074 1727204620.22287: stdout chunk (state=3): >>><<< 40074 1727204620.22314: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.22325: _low_level_execute_command(): starting 40074 1727204620.22336: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/AnsiballZ_command.py && sleep 0' 40074 1727204620.23010: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.23026: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.23039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.23065: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204620.23173: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.23198: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.23292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.41500: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-24 15:03:40.409943", "end": "2024-09-24 15:03:40.413957", "delta": "0:00:00.004014", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204620.43386: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204620.43426: stdout chunk (state=3): >>><<< 40074 1727204620.43429: stderr chunk (state=3): >>><<< 40074 1727204620.43450: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerethtest1", "up"], "start": "2024-09-24 15:03:40.409943", "end": "2024-09-24 15:03:40.413957", "delta": "0:00:00.004014", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204620.43591: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204620.43595: _low_level_execute_command(): starting 40074 1727204620.43597: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204620.1162207-40745-135765309169892/ > /dev/null 2>&1 && sleep 0' 40074 1727204620.44169: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.44185: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.44308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.44357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.44393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.46460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.46464: stdout chunk (state=3): >>><<< 40074 1727204620.46467: stderr chunk (state=3): >>><<< 40074 1727204620.46503: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.46694: handler run complete 40074 1727204620.46698: Evaluated conditional (False): False 40074 1727204620.46700: attempt loop 
complete, returning result 40074 1727204620.46702: variable 'item' from source: unknown 40074 1727204620.46704: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set peerethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerethtest1", "up" ], "delta": "0:00:00.004014", "end": "2024-09-24 15:03:40.413957", "item": "ip link set peerethtest1 up", "rc": 0, "start": "2024-09-24 15:03:40.409943" } 40074 1727204620.46941: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.46957: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.46973: variable 'omit' from source: magic vars 40074 1727204620.47201: variable 'ansible_distribution_major_version' from source: facts 40074 1727204620.47221: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204620.47497: variable 'type' from source: set_fact 40074 1727204620.47547: variable 'state' from source: include params 40074 1727204620.47550: variable 'interface' from source: set_fact 40074 1727204620.47552: variable 'current_interfaces' from source: set_fact 40074 1727204620.47555: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 40074 1727204620.47557: variable 'omit' from source: magic vars 40074 1727204620.47564: variable 'omit' from source: magic vars 40074 1727204620.47624: variable 'item' from source: unknown 40074 1727204620.47713: variable 'item' from source: unknown 40074 1727204620.47736: variable 'omit' from source: magic vars 40074 1727204620.47796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204620.47799: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204620.47802: Loading ShellModule 
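The `delta` field in each command-module result above is simply `end` minus `start` as wall-clock time. A quick standard-library sketch, using the timestamps copied from the `ip link set peerethtest1 up` result:

```python
from datetime import datetime

# Timestamp format and values copied from the task result above.
FMT = "%Y-%m-%d %H:%M:%S.%f"
start = datetime.strptime("2024-09-24 15:03:40.409943", FMT)
end = datetime.strptime("2024-09-24 15:03:40.413957", FMT)

# "delta" in the module result is the elapsed time between start and end;
# str() of a timedelta gives the same H:MM:SS.ffffff rendering.
delta = end - start
print(delta)
# → 0:00:00.004014
```

The few-millisecond delta covers only the `ip` command itself on the remote host; the surrounding seconds visible in the log's own epoch timestamps are SSH multiplexed-connection and AnsiballZ transfer overhead.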
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204620.47814: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204620.47822: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.47829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.47943: Set connection var ansible_pipelining to False 40074 1727204620.47980: Set connection var ansible_shell_executable to /bin/sh 40074 1727204620.47983: Set connection var ansible_shell_type to sh 40074 1727204620.47985: Set connection var ansible_connection to ssh 40074 1727204620.47987: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204620.47991: Set connection var ansible_timeout to 10 40074 1727204620.48025: variable 'ansible_shell_executable' from source: unknown 40074 1727204620.48089: variable 'ansible_connection' from source: unknown 40074 1727204620.48093: variable 'ansible_module_compression' from source: unknown 40074 1727204620.48097: variable 'ansible_shell_type' from source: unknown 40074 1727204620.48099: variable 'ansible_shell_executable' from source: unknown 40074 1727204620.48101: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.48103: variable 'ansible_pipelining' from source: unknown 40074 1727204620.48106: variable 'ansible_timeout' from source: unknown 40074 1727204620.48108: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.48206: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204620.48220: variable 'omit' from source: magic vars 40074 1727204620.48235: 
starting attempt loop 40074 1727204620.48242: running the handler 40074 1727204620.48258: _low_level_execute_command(): starting 40074 1727204620.48266: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204620.48942: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.48962: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.48992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.49083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204620.49087: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.49126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.49142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.49166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.49241: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.51009: stdout chunk (state=3): >>>/root <<< 40074 1727204620.51295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.51299: stdout 
chunk (state=3): >>><<< 40074 1727204620.51302: stderr chunk (state=3): >>><<< 40074 1727204620.51304: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.51307: _low_level_execute_command(): starting 40074 1727204620.51309: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543 `" && echo ansible-tmp-1727204620.5122268-40745-217214585884543="` echo /root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543 `" ) && sleep 0' 40074 1727204620.51908: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.51918: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.51930: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.51976: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204620.51979: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204620.51987: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204620.51990: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.51998: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204620.52096: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204620.52100: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 40074 1727204620.52102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.52104: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.52107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204620.52109: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204620.52111: stderr chunk (state=3): >>>debug2: match found <<< 40074 1727204620.52113: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.52197: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.52200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.52244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.54348: stdout chunk (state=3): 
>>>ansible-tmp-1727204620.5122268-40745-217214585884543=/root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543 <<< 40074 1727204620.54930: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.54934: stdout chunk (state=3): >>><<< 40074 1727204620.54936: stderr chunk (state=3): >>><<< 40074 1727204620.54939: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204620.5122268-40745-217214585884543=/root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.54941: variable 'ansible_module_compression' from source: unknown 40074 1727204620.54944: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204620.54946: variable 'ansible_facts' from source: unknown 40074 
1727204620.55033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/AnsiballZ_command.py 40074 1727204620.55254: Sending initial data 40074 1727204620.55259: Sent initial data (156 bytes) 40074 1727204620.55922: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.55983: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.56005: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.56045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.56115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.57854: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 40074 1727204620.57859: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204620.57896: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204620.57954: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp4kglm16t /root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/AnsiballZ_command.py <<< 40074 1727204620.57957: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/AnsiballZ_command.py" <<< 40074 1727204620.58002: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp4kglm16t" to remote "/root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/AnsiballZ_command.py" <<< 40074 1727204620.59169: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.59279: stderr chunk (state=3): >>><<< 40074 1727204620.59282: stdout chunk (state=3): >>><<< 40074 1727204620.59284: done transferring module to remote 40074 1727204620.59287: _low_level_execute_command(): starting 40074 1727204620.59291: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/ /root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/AnsiballZ_command.py && 
sleep 0' 40074 1727204620.59957: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.59975: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.59998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.60023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204620.60053: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204620.60067: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204620.60082: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.60107: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204620.60122: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204620.60210: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 40074 1727204620.60213: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.60279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.60350: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.62387: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.62393: stdout chunk (state=3): >>><<< 40074 1727204620.62405: stderr 
chunk (state=3): >>><<< 40074 1727204620.62428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.62434: _low_level_execute_command(): starting 40074 1727204620.62437: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/AnsiballZ_command.py && sleep 0' 40074 1727204620.63076: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204620.63087: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.63101: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.63117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204620.63294: stderr chunk (state=3): 
>>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204620.63298: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204620.63301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.63303: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204620.63310: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204620.63312: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 40074 1727204620.63315: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.63317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.63319: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.63322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.63405: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.81646: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-24 15:03:40.811448", "end": "2024-09-24 15:03:40.815275", "delta": "0:00:00.003827", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 40074 1727204620.81729: 
stdout chunk (state=3): >>> <<< 40074 1727204620.83467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204620.83524: stderr chunk (state=3): >>><<< 40074 1727204620.83528: stdout chunk (state=3): >>><<< 40074 1727204620.83548: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "ethtest1", "up"], "start": "2024-09-24 15:03:40.811448", "end": "2024-09-24 15:03:40.815275", "delta": "0:00:00.003827", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set ethtest1 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. 40074 1727204620.83576: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set ethtest1 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204620.83584: _low_level_execute_command(): starting 40074 1727204620.83592: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204620.5122268-40745-217214585884543/ > /dev/null 2>&1 && sleep 0' 40074 1727204620.84046: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.84050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.84052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.84055: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.84109: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.84116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.84156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.86108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.86156: stderr chunk (state=3): >>><<< 40074 1727204620.86160: stdout chunk (state=3): >>><<< 40074 1727204620.86174: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 
1727204620.86180: handler run complete 40074 1727204620.86200: Evaluated conditional (False): False 40074 1727204620.86213: attempt loop complete, returning result 40074 1727204620.86236: variable 'item' from source: unknown 40074 1727204620.86303: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set ethtest1 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "ethtest1", "up" ], "delta": "0:00:00.003827", "end": "2024-09-24 15:03:40.815275", "item": "ip link set ethtest1 up", "rc": 0, "start": "2024-09-24 15:03:40.811448" } 40074 1727204620.86526: dumping result to json 40074 1727204620.86529: done dumping result, returning 40074 1727204620.86534: done running TaskExecutor() for managed-node2/TASK: Create veth interface ethtest1 [12b410aa-8751-9fd7-2501-000000000300] 40074 1727204620.86536: sending task result for task 12b410aa-8751-9fd7-2501-000000000300 40074 1727204620.86582: done sending task result for task 12b410aa-8751-9fd7-2501-000000000300 40074 1727204620.86585: WORKER PROCESS EXITING 40074 1727204620.86719: no more pending results, returning what we have 40074 1727204620.86723: results queue empty 40074 1727204620.86724: checking for any_errors_fatal 40074 1727204620.86730: done checking for any_errors_fatal 40074 1727204620.86733: checking for max_fail_percentage 40074 1727204620.86734: done checking for max_fail_percentage 40074 1727204620.86735: checking to see if all hosts have failed and the running result is not ok 40074 1727204620.86736: done checking to see if all hosts have failed 40074 1727204620.86737: getting the remaining hosts for this loop 40074 1727204620.86738: done getting the remaining hosts for this loop 40074 1727204620.86742: getting the next task for host managed-node2 40074 1727204620.86747: done getting next task for host managed-node2 40074 1727204620.86750: ^ task is: TASK: Set up veth as managed by NetworkManager 40074 1727204620.86752: ^ state is: HOST STATE: block=2, 
task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204620.86756: getting variables 40074 1727204620.86757: in VariableManager get_vars() 40074 1727204620.86783: Calling all_inventory to load vars for managed-node2 40074 1727204620.86785: Calling groups_inventory to load vars for managed-node2 40074 1727204620.86787: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204620.86797: Calling all_plugins_play to load vars for managed-node2 40074 1727204620.86799: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204620.86802: Calling groups_plugins_play to load vars for managed-node2 40074 1727204620.86961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204620.87159: done with get_vars() 40074 1727204620.87169: done getting variables 40074 1727204620.87220: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 15:03:40 -0400 (0:00:01.279) 0:00:14.634 ***** 40074 
1727204620.87244: entering _queue_task() for managed-node2/command 40074 1727204620.87467: worker is 1 (out of 1 available) 40074 1727204620.87481: exiting _queue_task() for managed-node2/command 40074 1727204620.87496: done queuing things up, now waiting for results queue to drain 40074 1727204620.87498: waiting for pending results... 40074 1727204620.87665: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 40074 1727204620.87742: in run() - task 12b410aa-8751-9fd7-2501-000000000301 40074 1727204620.87755: variable 'ansible_search_path' from source: unknown 40074 1727204620.87760: variable 'ansible_search_path' from source: unknown 40074 1727204620.87794: calling self._execute() 40074 1727204620.87870: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.87877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.87887: variable 'omit' from source: magic vars 40074 1727204620.88196: variable 'ansible_distribution_major_version' from source: facts 40074 1727204620.88207: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204620.88339: variable 'type' from source: set_fact 40074 1727204620.88343: variable 'state' from source: include params 40074 1727204620.88350: Evaluated conditional (type == 'veth' and state == 'present'): True 40074 1727204620.88356: variable 'omit' from source: magic vars 40074 1727204620.88395: variable 'omit' from source: magic vars 40074 1727204620.88474: variable 'interface' from source: set_fact 40074 1727204620.88492: variable 'omit' from source: magic vars 40074 1727204620.88530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204620.88561: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204620.88615: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204620.88635: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204620.88645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204620.88672: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204620.88675: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.88680: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.88771: Set connection var ansible_pipelining to False 40074 1727204620.88778: Set connection var ansible_shell_executable to /bin/sh 40074 1727204620.88781: Set connection var ansible_shell_type to sh 40074 1727204620.88784: Set connection var ansible_connection to ssh 40074 1727204620.88793: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204620.88800: Set connection var ansible_timeout to 10 40074 1727204620.88825: variable 'ansible_shell_executable' from source: unknown 40074 1727204620.88828: variable 'ansible_connection' from source: unknown 40074 1727204620.88834: variable 'ansible_module_compression' from source: unknown 40074 1727204620.88836: variable 'ansible_shell_type' from source: unknown 40074 1727204620.88839: variable 'ansible_shell_executable' from source: unknown 40074 1727204620.88841: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204620.88845: variable 'ansible_pipelining' from source: unknown 40074 1727204620.88850: variable 'ansible_timeout' from source: unknown 40074 1727204620.88855: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204620.88973: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204620.88983: variable 'omit' from source: magic vars 40074 1727204620.88991: starting attempt loop 40074 1727204620.88995: running the handler 40074 1727204620.89009: _low_level_execute_command(): starting 40074 1727204620.89016: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204620.89560: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.89565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.89569: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204620.89571: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.89630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.89634: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.89683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.91451: stdout 
chunk (state=3): >>>/root <<< 40074 1727204620.91558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.91615: stderr chunk (state=3): >>><<< 40074 1727204620.91619: stdout chunk (state=3): >>><<< 40074 1727204620.91642: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.91654: _low_level_execute_command(): starting 40074 1727204620.91659: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121 `" && echo ansible-tmp-1727204620.9164119-40790-84613625235121="` echo /root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121 `" ) && sleep 0' 40074 1727204620.92122: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.92126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.92140: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204620.92144: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.92187: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.92193: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.92240: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.94285: stdout chunk (state=3): >>>ansible-tmp-1727204620.9164119-40790-84613625235121=/root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121 <<< 40074 1727204620.94406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.94456: stderr chunk (state=3): >>><<< 40074 1727204620.94459: stdout chunk (state=3): >>><<< 40074 1727204620.94474: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204620.9164119-40790-84613625235121=/root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 
Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204620.94504: variable 'ansible_module_compression' from source: unknown 40074 1727204620.94548: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204620.94577: variable 'ansible_facts' from source: unknown 40074 1727204620.94644: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/AnsiballZ_command.py 40074 1727204620.94751: Sending initial data 40074 1727204620.94755: Sent initial data (155 bytes) 40074 1727204620.95215: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.95218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204620.95221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204620.95223: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.95279: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204620.95286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.95325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204620.97015: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204620.97048: stderr chunk (state=3): 
>>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204620.97083: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpa99clsp6 /root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/AnsiballZ_command.py <<< 40074 1727204620.97092: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/AnsiballZ_command.py" <<< 40074 1727204620.97119: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpa99clsp6" to remote "/root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/AnsiballZ_command.py" <<< 40074 1727204620.97894: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204620.97962: stderr chunk (state=3): >>><<< 40074 1727204620.97966: stdout chunk (state=3): >>><<< 40074 1727204620.97986: done transferring module to remote 40074 1727204620.97998: _low_level_execute_command(): starting 40074 1727204620.98003: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/ /root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/AnsiballZ_command.py && sleep 0' 40074 1727204620.98438: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204620.98442: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.98454: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204620.98467: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204620.98514: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204620.98531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204620.98567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.00485: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204621.00533: stderr chunk (state=3): >>><<< 40074 1727204621.00538: stdout chunk (state=3): >>><<< 40074 1727204621.00552: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204621.00556: _low_level_execute_command(): starting 40074 1727204621.00562: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/AnsiballZ_command.py && sleep 0' 40074 1727204621.01000: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204621.01004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.01006: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204621.01009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.01062: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204621.01069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 
1727204621.01118: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.21344: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-24 15:03:41.186307", "end": "2024-09-24 15:03:41.209147", "delta": "0:00:00.022840", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204621.23475: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204621.23538: stderr chunk (state=3): >>><<< 40074 1727204621.23542: stdout chunk (state=3): >>><<< 40074 1727204621.23560: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "ethtest1", "managed", "true"], "start": "2024-09-24 15:03:41.186307", "end": "2024-09-24 15:03:41.209147", "delta": "0:00:00.022840", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set ethtest1 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204621.23602: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set ethtest1 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204621.23615: _low_level_execute_command(): starting 40074 1727204621.23620: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204620.9164119-40790-84613625235121/ > /dev/null 2>&1 && sleep 0' 40074 1727204621.24096: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 
1727204621.24100: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204621.24105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204621.24116: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.24175: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204621.24179: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204621.24224: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.26295: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204621.26299: stdout chunk (state=3): >>><<< 40074 1727204621.26301: stderr chunk (state=3): >>><<< 40074 1727204621.26304: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204621.26306: handler run complete
40074 1727204621.26329: Evaluated conditional (False): False
40074 1727204621.26349: attempt loop complete, returning result
40074 1727204621.26356: _execute() done
40074 1727204621.26365: dumping result to json
40074 1727204621.26421: done dumping result, returning
40074 1727204621.26424: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [12b410aa-8751-9fd7-2501-000000000301]
40074 1727204621.26427: sending task result for task 12b410aa-8751-9fd7-2501-000000000301
ok: [managed-node2] => {
    "changed": false,
    "cmd": [
        "nmcli",
        "d",
        "set",
        "ethtest1",
        "managed",
        "true"
    ],
    "delta": "0:00:00.022840",
    "end": "2024-09-24 15:03:41.209147",
    "rc": 0,
    "start": "2024-09-24 15:03:41.186307"
}
40074 1727204621.26776: no more pending results, returning what we have
40074 1727204621.26780: results queue empty
40074 1727204621.26781: checking for any_errors_fatal
40074 1727204621.26801: done checking for any_errors_fatal
40074 1727204621.26802: checking for max_fail_percentage
40074 1727204621.26804: done checking for max_fail_percentage
40074 1727204621.26805: checking to see if all hosts have failed and the running result is not ok
40074 1727204621.26807: done checking to see if all hosts have failed
40074 1727204621.26808: getting the remaining hosts for this loop
40074 1727204621.26809: done getting the remaining hosts for this loop
40074 1727204621.26814: getting the next task for host managed-node2
40074 1727204621.26820: done getting next task for host managed-node2
40074 1727204621.26824: ^ task is: TASK: Delete veth interface {{ interface }}
40074 1727204621.26828: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204621.26832: getting variables
40074 1727204621.26834: in VariableManager get_vars()
40074 1727204621.26880: Calling all_inventory to load vars for managed-node2
40074 1727204621.26883: Calling groups_inventory to load vars for managed-node2
40074 1727204621.26886: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204621.27009: done sending task result for task 12b410aa-8751-9fd7-2501-000000000301
40074 1727204621.27013: WORKER PROCESS EXITING
40074 1727204621.27024: Calling all_plugins_play to load vars for managed-node2
40074 1727204621.27028: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204621.27034: Calling groups_plugins_play to load vars for managed-node2
40074 1727204621.27403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204621.27784: done with get_vars()
40074 1727204621.27799: done getting variables
40074 1727204621.27866: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
40074 1727204621.28012: variable 'interface' from source: set_fact

TASK [Delete veth interface ethtest1] ******************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43
Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.407) 0:00:15.042 *****
40074 1727204621.28047: entering _queue_task() for managed-node2/command
40074 1727204621.28441: worker is 1 (out of 1 available)
40074 1727204621.28454: exiting _queue_task() for managed-node2/command
40074 1727204621.28464: done queuing things up, now waiting for results queue to drain
40074 1727204621.28466: waiting for pending results...
40074 1727204621.28654: running TaskExecutor() for managed-node2/TASK: Delete veth interface ethtest1
40074 1727204621.28784: in run() - task 12b410aa-8751-9fd7-2501-000000000302
40074 1727204621.28812: variable 'ansible_search_path' from source: unknown
40074 1727204621.28821: variable 'ansible_search_path' from source: unknown
40074 1727204621.28867: calling self._execute()
40074 1727204621.28977: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204621.28993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204621.29011: variable 'omit' from source: magic vars
40074 1727204621.29442: variable 'ansible_distribution_major_version' from source: facts
40074 1727204621.29465: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204621.29752: variable 'type' from source: set_fact
40074 1727204621.29763: variable 'state' from source: include params
40074 1727204621.29776: variable 'interface' from source: set_fact
40074 1727204621.29786: variable 'current_interfaces' from source: set_fact
40074 1727204621.29801: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False
40074 1727204621.29810: when evaluation is False, skipping this task
40074 1727204621.29817: _execute() done
40074 1727204621.29825: dumping result to json
40074 1727204621.29833: done dumping result, returning
40074 1727204621.29850: done running TaskExecutor() for managed-node2/TASK: Delete veth interface ethtest1 [12b410aa-8751-9fd7-2501-000000000302]
40074 1727204621.29859: sending task result for task 12b410aa-8751-9fd7-2501-000000000302
40074 1727204621.30084: done sending task result for task 12b410aa-8751-9fd7-2501-000000000302
40074 1727204621.30087: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
40074 1727204621.30141: no more pending results, returning what we have
40074 1727204621.30145: results queue empty
40074 1727204621.30146: checking for any_errors_fatal
40074 1727204621.30158: done checking for any_errors_fatal
40074 1727204621.30159: checking for max_fail_percentage
40074 1727204621.30161: done checking for max_fail_percentage
40074 1727204621.30162: checking to see if all hosts have failed and the running result is not ok
40074 1727204621.30168: done checking to see if all hosts have failed
40074 1727204621.30169: getting the remaining hosts for this loop
40074 1727204621.30171: done getting the remaining hosts for this loop
40074 1727204621.30175: getting the next task for host managed-node2
40074 1727204621.30182: done getting next task for host managed-node2
40074 1727204621.30185: ^ task is: TASK: Create dummy interface {{ interface }}
40074 1727204621.30191: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204621.30196: getting variables
40074 1727204621.30198: in VariableManager get_vars()
40074 1727204621.30242: Calling all_inventory to load vars for managed-node2
40074 1727204621.30245: Calling groups_inventory to load vars for managed-node2
40074 1727204621.30248: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204621.30262: Calling all_plugins_play to load vars for managed-node2
40074 1727204621.30266: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204621.30270: Calling groups_plugins_play to load vars for managed-node2
40074 1727204621.30661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204621.30981: done with get_vars()
40074 1727204621.30996: done getting variables
40074 1727204621.31075: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
40074 1727204621.31216: variable 'interface' from source: set_fact

TASK [Create dummy interface ethtest1] *****************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49
Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.031) 0:00:15.074 *****
40074 1727204621.31249: entering _queue_task() for managed-node2/command
40074 1727204621.31633: worker is 1 (out of 1 available)
40074 1727204621.31650: exiting _queue_task() for managed-node2/command
40074 1727204621.31664: done queuing things up, now waiting for results queue to drain
40074 1727204621.31666: waiting for pending results...
40074 1727204621.32007: running TaskExecutor() for managed-node2/TASK: Create dummy interface ethtest1
40074 1727204621.32087: in run() - task 12b410aa-8751-9fd7-2501-000000000303
40074 1727204621.32114: variable 'ansible_search_path' from source: unknown
40074 1727204621.32124: variable 'ansible_search_path' from source: unknown
40074 1727204621.32178: calling self._execute()
40074 1727204621.32292: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204621.32307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204621.32365: variable 'omit' from source: magic vars
40074 1727204621.32762: variable 'ansible_distribution_major_version' from source: facts
40074 1727204621.32783: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204621.33071: variable 'type' from source: set_fact
40074 1727204621.33084: variable 'state' from source: include params
40074 1727204621.33098: variable 'interface' from source: set_fact
40074 1727204621.33128: variable 'current_interfaces' from source: set_fact
40074 1727204621.33132: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False
40074 1727204621.33138: when evaluation is False, skipping this task
40074 1727204621.33237: _execute() done
40074 1727204621.33241: dumping result to json
40074 1727204621.33244: done dumping result, returning
40074 1727204621.33247: done running TaskExecutor() for managed-node2/TASK: Create dummy interface ethtest1 [12b410aa-8751-9fd7-2501-000000000303]
40074 1727204621.33249: sending task result for task 12b410aa-8751-9fd7-2501-000000000303
40074 1727204621.33319: done sending task result for task 12b410aa-8751-9fd7-2501-000000000303
40074 1727204621.33323: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces",
    "skip_reason": "Conditional result was False"
}
40074 1727204621.33395: no more pending results, returning what we have
40074 1727204621.33400: results queue empty
40074 1727204621.33401: checking for any_errors_fatal
40074 1727204621.33410: done checking for any_errors_fatal
40074 1727204621.33411: checking for max_fail_percentage
40074 1727204621.33413: done checking for max_fail_percentage
40074 1727204621.33414: checking to see if all hosts have failed and the running result is not ok
40074 1727204621.33416: done checking to see if all hosts have failed
40074 1727204621.33417: getting the remaining hosts for this loop
40074 1727204621.33418: done getting the remaining hosts for this loop
40074 1727204621.33422: getting the next task for host managed-node2
40074 1727204621.33429: done getting next task for host managed-node2
40074 1727204621.33431: ^ task is: TASK: Delete dummy interface {{ interface }}
40074 1727204621.33435: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204621.33440: getting variables
40074 1727204621.33441: in VariableManager get_vars()
40074 1727204621.33708: Calling all_inventory to load vars for managed-node2
40074 1727204621.33712: Calling groups_inventory to load vars for managed-node2
40074 1727204621.33715: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204621.33729: Calling all_plugins_play to load vars for managed-node2
40074 1727204621.33733: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204621.33737: Calling groups_plugins_play to load vars for managed-node2
40074 1727204621.35215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204621.35920: done with get_vars()
40074 1727204621.35932: done getting variables
40074 1727204621.36096: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
40074 1727204621.36293: variable 'interface' from source: set_fact

TASK [Delete dummy interface ethtest1] *****************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54
Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.050) 0:00:15.125 *****
40074 1727204621.36327: entering _queue_task() for managed-node2/command
40074 1727204621.37013: worker is 1 (out of 1 available)
40074 1727204621.37026: exiting _queue_task() for managed-node2/command
40074 1727204621.37038: done queuing things up, now waiting for results queue to drain
40074 1727204621.37040: waiting for pending results...
40074 1727204621.37433: running TaskExecutor() for managed-node2/TASK: Delete dummy interface ethtest1
40074 1727204621.37658: in run() - task 12b410aa-8751-9fd7-2501-000000000304
40074 1727204621.37749: variable 'ansible_search_path' from source: unknown
40074 1727204621.37754: variable 'ansible_search_path' from source: unknown
40074 1727204621.37757: calling self._execute()
40074 1727204621.38055: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204621.38064: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204621.38077: variable 'omit' from source: magic vars
40074 1727204621.39076: variable 'ansible_distribution_major_version' from source: facts
40074 1727204621.39098: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204621.39606: variable 'type' from source: set_fact
40074 1727204621.39797: variable 'state' from source: include params
40074 1727204621.39801: variable 'interface' from source: set_fact
40074 1727204621.39804: variable 'current_interfaces' from source: set_fact
40074 1727204621.39807: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False
40074 1727204621.39809: when evaluation is False, skipping this task
40074 1727204621.39812: _execute() done
40074 1727204621.39814: dumping result to json
40074 1727204621.39816: done dumping result, returning
40074 1727204621.39818: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface ethtest1 [12b410aa-8751-9fd7-2501-000000000304]
40074 1727204621.39820: sending task result for task 12b410aa-8751-9fd7-2501-000000000304
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces",
    "skip_reason": "Conditional result was False"
}
40074 1727204621.40116: no more pending results, returning what we have
40074 1727204621.40120: results queue empty
40074
1727204621.40122: checking for any_errors_fatal 40074 1727204621.40131: done checking for any_errors_fatal 40074 1727204621.40132: checking for max_fail_percentage 40074 1727204621.40134: done checking for max_fail_percentage 40074 1727204621.40135: checking to see if all hosts have failed and the running result is not ok 40074 1727204621.40137: done checking to see if all hosts have failed 40074 1727204621.40138: getting the remaining hosts for this loop 40074 1727204621.40139: done getting the remaining hosts for this loop 40074 1727204621.40143: getting the next task for host managed-node2 40074 1727204621.40151: done getting next task for host managed-node2 40074 1727204621.40154: ^ task is: TASK: Create tap interface {{ interface }} 40074 1727204621.40158: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204621.40164: getting variables 40074 1727204621.40168: in VariableManager get_vars() 40074 1727204621.40218: Calling all_inventory to load vars for managed-node2 40074 1727204621.40222: Calling groups_inventory to load vars for managed-node2 40074 1727204621.40225: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.40241: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.40245: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.40250: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.40774: done sending task result for task 12b410aa-8751-9fd7-2501-000000000304 40074 1727204621.40778: WORKER PROCESS EXITING 40074 1727204621.40799: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.41213: done with get_vars() 40074 1727204621.41224: done getting variables 40074 1727204621.41274: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204621.41367: variable 'interface' from source: set_fact TASK [Create tap interface ethtest1] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.050) 0:00:15.175 ***** 40074 1727204621.41393: entering _queue_task() for managed-node2/command 40074 1727204621.41597: worker is 1 (out of 1 available) 40074 1727204621.41612: exiting _queue_task() for managed-node2/command 40074 1727204621.41626: done queuing things up, now waiting for results queue to drain 40074 1727204621.41627: waiting for pending results... 
40074 1727204621.41796: running TaskExecutor() for managed-node2/TASK: Create tap interface ethtest1 40074 1727204621.41879: in run() - task 12b410aa-8751-9fd7-2501-000000000305 40074 1727204621.41895: variable 'ansible_search_path' from source: unknown 40074 1727204621.41899: variable 'ansible_search_path' from source: unknown 40074 1727204621.41935: calling self._execute() 40074 1727204621.42016: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.42023: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.42036: variable 'omit' from source: magic vars 40074 1727204621.42364: variable 'ansible_distribution_major_version' from source: facts 40074 1727204621.42379: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204621.42681: variable 'type' from source: set_fact 40074 1727204621.42685: variable 'state' from source: include params 40074 1727204621.42690: variable 'interface' from source: set_fact 40074 1727204621.42721: variable 'current_interfaces' from source: set_fact 40074 1727204621.42725: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 40074 1727204621.42728: when evaluation is False, skipping this task 40074 1727204621.42730: _execute() done 40074 1727204621.42733: dumping result to json 40074 1727204621.42735: done dumping result, returning 40074 1727204621.42738: done running TaskExecutor() for managed-node2/TASK: Create tap interface ethtest1 [12b410aa-8751-9fd7-2501-000000000305] 40074 1727204621.42740: sending task result for task 12b410aa-8751-9fd7-2501-000000000305 40074 1727204621.42807: done sending task result for task 12b410aa-8751-9fd7-2501-000000000305 40074 1727204621.42810: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was 
False" } 40074 1727204621.42868: no more pending results, returning what we have 40074 1727204621.42872: results queue empty 40074 1727204621.42873: checking for any_errors_fatal 40074 1727204621.42877: done checking for any_errors_fatal 40074 1727204621.42878: checking for max_fail_percentage 40074 1727204621.42879: done checking for max_fail_percentage 40074 1727204621.42880: checking to see if all hosts have failed and the running result is not ok 40074 1727204621.42882: done checking to see if all hosts have failed 40074 1727204621.42883: getting the remaining hosts for this loop 40074 1727204621.42884: done getting the remaining hosts for this loop 40074 1727204621.42887: getting the next task for host managed-node2 40074 1727204621.42894: done getting next task for host managed-node2 40074 1727204621.42897: ^ task is: TASK: Delete tap interface {{ interface }} 40074 1727204621.42900: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204621.42905: getting variables 40074 1727204621.42906: in VariableManager get_vars() 40074 1727204621.42942: Calling all_inventory to load vars for managed-node2 40074 1727204621.42945: Calling groups_inventory to load vars for managed-node2 40074 1727204621.42948: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.42958: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.42961: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.42964: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.43248: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.43626: done with get_vars() 40074 1727204621.43639: done getting variables 40074 1727204621.43713: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204621.43842: variable 'interface' from source: set_fact TASK [Delete tap interface ethtest1] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.024) 0:00:15.200 ***** 40074 1727204621.43879: entering _queue_task() for managed-node2/command 40074 1727204621.44074: worker is 1 (out of 1 available) 40074 1727204621.44090: exiting _queue_task() for managed-node2/command 40074 1727204621.44103: done queuing things up, now waiting for results queue to drain 40074 1727204621.44104: waiting for pending results... 
40074 1727204621.44271: running TaskExecutor() for managed-node2/TASK: Delete tap interface ethtest1 40074 1727204621.44349: in run() - task 12b410aa-8751-9fd7-2501-000000000306 40074 1727204621.44362: variable 'ansible_search_path' from source: unknown 40074 1727204621.44367: variable 'ansible_search_path' from source: unknown 40074 1727204621.44399: calling self._execute() 40074 1727204621.44476: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.44483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.44495: variable 'omit' from source: magic vars 40074 1727204621.44787: variable 'ansible_distribution_major_version' from source: facts 40074 1727204621.44799: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204621.44972: variable 'type' from source: set_fact 40074 1727204621.44976: variable 'state' from source: include params 40074 1727204621.44986: variable 'interface' from source: set_fact 40074 1727204621.44999: variable 'current_interfaces' from source: set_fact 40074 1727204621.45004: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 40074 1727204621.45007: when evaluation is False, skipping this task 40074 1727204621.45009: _execute() done 40074 1727204621.45012: dumping result to json 40074 1727204621.45014: done dumping result, returning 40074 1727204621.45019: done running TaskExecutor() for managed-node2/TASK: Delete tap interface ethtest1 [12b410aa-8751-9fd7-2501-000000000306] 40074 1727204621.45025: sending task result for task 12b410aa-8751-9fd7-2501-000000000306 40074 1727204621.45114: done sending task result for task 12b410aa-8751-9fd7-2501-000000000306 40074 1727204621.45118: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 
40074 1727204621.45165: no more pending results, returning what we have 40074 1727204621.45169: results queue empty 40074 1727204621.45170: checking for any_errors_fatal 40074 1727204621.45174: done checking for any_errors_fatal 40074 1727204621.45175: checking for max_fail_percentage 40074 1727204621.45177: done checking for max_fail_percentage 40074 1727204621.45178: checking to see if all hosts have failed and the running result is not ok 40074 1727204621.45179: done checking to see if all hosts have failed 40074 1727204621.45180: getting the remaining hosts for this loop 40074 1727204621.45181: done getting the remaining hosts for this loop 40074 1727204621.45185: getting the next task for host managed-node2 40074 1727204621.45193: done getting next task for host managed-node2 40074 1727204621.45197: ^ task is: TASK: Assert device is present 40074 1727204621.45199: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204621.45202: getting variables 40074 1727204621.45203: in VariableManager get_vars() 40074 1727204621.45239: Calling all_inventory to load vars for managed-node2 40074 1727204621.45242: Calling groups_inventory to load vars for managed-node2 40074 1727204621.45245: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.45254: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.45256: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.45259: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.45457: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.45651: done with get_vars() 40074 1727204621.45659: done getting variables TASK [Assert device is present] ************************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:32 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.018) 0:00:15.219 ***** 40074 1727204621.45732: entering _queue_task() for managed-node2/include_tasks 40074 1727204621.45917: worker is 1 (out of 1 available) 40074 1727204621.45932: exiting _queue_task() for managed-node2/include_tasks 40074 1727204621.45946: done queuing things up, now waiting for results queue to drain 40074 1727204621.45947: waiting for pending results... 
40074 1727204621.46207: running TaskExecutor() for managed-node2/TASK: Assert device is present 40074 1727204621.46254: in run() - task 12b410aa-8751-9fd7-2501-000000000012 40074 1727204621.46297: variable 'ansible_search_path' from source: unknown 40074 1727204621.46322: calling self._execute() 40074 1727204621.46435: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.46579: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.46583: variable 'omit' from source: magic vars 40074 1727204621.46939: variable 'ansible_distribution_major_version' from source: facts 40074 1727204621.46961: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204621.47081: _execute() done 40074 1727204621.47084: dumping result to json 40074 1727204621.47087: done dumping result, returning 40074 1727204621.47092: done running TaskExecutor() for managed-node2/TASK: Assert device is present [12b410aa-8751-9fd7-2501-000000000012] 40074 1727204621.47094: sending task result for task 12b410aa-8751-9fd7-2501-000000000012 40074 1727204621.47162: done sending task result for task 12b410aa-8751-9fd7-2501-000000000012 40074 1727204621.47165: WORKER PROCESS EXITING 40074 1727204621.47216: no more pending results, returning what we have 40074 1727204621.47222: in VariableManager get_vars() 40074 1727204621.47316: Calling all_inventory to load vars for managed-node2 40074 1727204621.47320: Calling groups_inventory to load vars for managed-node2 40074 1727204621.47323: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.47339: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.47343: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.47347: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.47643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 40074 1727204621.47852: done with get_vars() 40074 1727204621.47859: variable 'ansible_search_path' from source: unknown 40074 1727204621.47868: we have included files to process 40074 1727204621.47869: generating all_blocks data 40074 1727204621.47870: done generating all_blocks data 40074 1727204621.47874: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 40074 1727204621.47875: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 40074 1727204621.47877: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 40074 1727204621.47966: in VariableManager get_vars() 40074 1727204621.47984: done with get_vars() 40074 1727204621.48075: done processing included file 40074 1727204621.48077: iterating over new_blocks loaded from include file 40074 1727204621.48078: in VariableManager get_vars() 40074 1727204621.48094: done with get_vars() 40074 1727204621.48095: filtering new block on tags 40074 1727204621.48110: done filtering new block on tags 40074 1727204621.48112: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 40074 1727204621.48115: extending task lists for all hosts with included blocks 40074 1727204621.49100: done extending task lists 40074 1727204621.49101: done processing included files 40074 1727204621.49101: results queue empty 40074 1727204621.49102: checking for any_errors_fatal 40074 1727204621.49104: done checking for any_errors_fatal 40074 1727204621.49105: checking for max_fail_percentage 40074 1727204621.49106: done checking for max_fail_percentage 40074 1727204621.49106: checking to see if all hosts have failed and the 
running result is not ok 40074 1727204621.49107: done checking to see if all hosts have failed 40074 1727204621.49107: getting the remaining hosts for this loop 40074 1727204621.49108: done getting the remaining hosts for this loop 40074 1727204621.49110: getting the next task for host managed-node2 40074 1727204621.49113: done getting next task for host managed-node2 40074 1727204621.49114: ^ task is: TASK: Include the task 'get_interface_stat.yml' 40074 1727204621.49116: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204621.49118: getting variables 40074 1727204621.49118: in VariableManager get_vars() 40074 1727204621.49131: Calling all_inventory to load vars for managed-node2 40074 1727204621.49132: Calling groups_inventory to load vars for managed-node2 40074 1727204621.49135: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.49140: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.49142: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.49145: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.49377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.49739: done with get_vars() 40074 1727204621.49751: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.040) 0:00:15.260 ***** 40074 1727204621.49836: entering _queue_task() for managed-node2/include_tasks 40074 1727204621.50117: worker is 1 (out of 1 available) 40074 1727204621.50134: exiting _queue_task() for managed-node2/include_tasks 40074 1727204621.50148: done queuing things up, now waiting for results queue to drain 40074 1727204621.50150: waiting for pending results... 
40074 1727204621.50607: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 40074 1727204621.50612: in run() - task 12b410aa-8751-9fd7-2501-0000000003eb 40074 1727204621.50615: variable 'ansible_search_path' from source: unknown 40074 1727204621.50622: variable 'ansible_search_path' from source: unknown 40074 1727204621.50628: calling self._execute() 40074 1727204621.50731: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.50745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.50766: variable 'omit' from source: magic vars 40074 1727204621.51198: variable 'ansible_distribution_major_version' from source: facts 40074 1727204621.51219: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204621.51236: _execute() done 40074 1727204621.51247: dumping result to json 40074 1727204621.51256: done dumping result, returning 40074 1727204621.51269: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-9fd7-2501-0000000003eb] 40074 1727204621.51281: sending task result for task 12b410aa-8751-9fd7-2501-0000000003eb 40074 1727204621.51416: no more pending results, returning what we have 40074 1727204621.51421: in VariableManager get_vars() 40074 1727204621.51476: Calling all_inventory to load vars for managed-node2 40074 1727204621.51480: Calling groups_inventory to load vars for managed-node2 40074 1727204621.51484: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.51511: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.51516: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.51521: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.51803: done sending task result for task 12b410aa-8751-9fd7-2501-0000000003eb 40074 1727204621.51807: WORKER PROCESS EXITING 40074 
1727204621.51819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.52035: done with get_vars() 40074 1727204621.52043: variable 'ansible_search_path' from source: unknown 40074 1727204621.52044: variable 'ansible_search_path' from source: unknown 40074 1727204621.52073: we have included files to process 40074 1727204621.52074: generating all_blocks data 40074 1727204621.52075: done generating all_blocks data 40074 1727204621.52076: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204621.52077: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204621.52078: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204621.52223: done processing included file 40074 1727204621.52226: iterating over new_blocks loaded from include file 40074 1727204621.52228: in VariableManager get_vars() 40074 1727204621.52242: done with get_vars() 40074 1727204621.52243: filtering new block on tags 40074 1727204621.52255: done filtering new block on tags 40074 1727204621.52257: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 40074 1727204621.52262: extending task lists for all hosts with included blocks 40074 1727204621.52344: done extending task lists 40074 1727204621.52345: done processing included files 40074 1727204621.52346: results queue empty 40074 1727204621.52346: checking for any_errors_fatal 40074 1727204621.52349: done checking for any_errors_fatal 40074 1727204621.52349: checking for max_fail_percentage 40074 1727204621.52350: done checking for 
max_fail_percentage 40074 1727204621.52351: checking to see if all hosts have failed and the running result is not ok 40074 1727204621.52351: done checking to see if all hosts have failed 40074 1727204621.52352: getting the remaining hosts for this loop 40074 1727204621.52353: done getting the remaining hosts for this loop 40074 1727204621.52355: getting the next task for host managed-node2 40074 1727204621.52359: done getting next task for host managed-node2 40074 1727204621.52360: ^ task is: TASK: Get stat for interface {{ interface }} 40074 1727204621.52362: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204621.52365: getting variables 40074 1727204621.52365: in VariableManager get_vars() 40074 1727204621.52377: Calling all_inventory to load vars for managed-node2 40074 1727204621.52380: Calling groups_inventory to load vars for managed-node2 40074 1727204621.52382: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.52386: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.52388: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.52392: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.52530: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.52721: done with get_vars() 40074 1727204621.52731: done getting variables 40074 1727204621.52858: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.030) 0:00:15.290 ***** 40074 1727204621.52881: entering _queue_task() for managed-node2/stat 40074 1727204621.53101: worker is 1 (out of 1 available) 40074 1727204621.53116: exiting _queue_task() for managed-node2/stat 40074 1727204621.53132: done queuing things up, now waiting for results queue to drain 40074 1727204621.53134: waiting for pending results... 
40074 1727204621.53308: running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest1 40074 1727204621.53384: in run() - task 12b410aa-8751-9fd7-2501-000000000483 40074 1727204621.53399: variable 'ansible_search_path' from source: unknown 40074 1727204621.53402: variable 'ansible_search_path' from source: unknown 40074 1727204621.53476: calling self._execute() 40074 1727204621.53584: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.53589: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.53691: variable 'omit' from source: magic vars 40074 1727204621.54188: variable 'ansible_distribution_major_version' from source: facts 40074 1727204621.54213: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204621.54229: variable 'omit' from source: magic vars 40074 1727204621.54302: variable 'omit' from source: magic vars 40074 1727204621.54448: variable 'interface' from source: set_fact 40074 1727204621.54484: variable 'omit' from source: magic vars 40074 1727204621.54551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204621.54611: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204621.54632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204621.54653: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204621.54664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204621.54695: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204621.54699: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.54703: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.54801: Set connection var ansible_pipelining to False 40074 1727204621.54808: Set connection var ansible_shell_executable to /bin/sh 40074 1727204621.54811: Set connection var ansible_shell_type to sh 40074 1727204621.54814: Set connection var ansible_connection to ssh 40074 1727204621.54821: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204621.54829: Set connection var ansible_timeout to 10 40074 1727204621.54852: variable 'ansible_shell_executable' from source: unknown 40074 1727204621.54856: variable 'ansible_connection' from source: unknown 40074 1727204621.54859: variable 'ansible_module_compression' from source: unknown 40074 1727204621.54862: variable 'ansible_shell_type' from source: unknown 40074 1727204621.54864: variable 'ansible_shell_executable' from source: unknown 40074 1727204621.54867: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.54873: variable 'ansible_pipelining' from source: unknown 40074 1727204621.54876: variable 'ansible_timeout' from source: unknown 40074 1727204621.54882: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.55053: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204621.55064: variable 'omit' from source: magic vars 40074 1727204621.55071: starting attempt loop 40074 1727204621.55074: running the handler 40074 1727204621.55092: _low_level_execute_command(): starting 40074 1727204621.55099: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204621.55630: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204621.55635: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.55638: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204621.55641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.55692: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204621.55696: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204621.55748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.57511: stdout chunk (state=3): >>>/root <<< 40074 1727204621.57628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204621.57683: stderr chunk (state=3): >>><<< 40074 1727204621.57687: stdout chunk (state=3): >>><<< 40074 1727204621.57710: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204621.57723: _low_level_execute_command(): starting 40074 1727204621.57732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190 `" && echo ansible-tmp-1727204621.5770907-40819-195470143076190="` echo /root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190 `" ) && sleep 0' 40074 1727204621.58207: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204621.58211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204621.58214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204621.58226: stderr chunk (state=3): >>>debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204621.58229: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.58272: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204621.58276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204621.58322: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.60384: stdout chunk (state=3): >>>ansible-tmp-1727204621.5770907-40819-195470143076190=/root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190 <<< 40074 1727204621.60503: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204621.60552: stderr chunk (state=3): >>><<< 40074 1727204621.60556: stdout chunk (state=3): >>><<< 40074 1727204621.60571: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204621.5770907-40819-195470143076190=/root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204621.60616: variable 'ansible_module_compression' from source: unknown 40074 1727204621.60666: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 40074 1727204621.60700: variable 'ansible_facts' from source: unknown 40074 1727204621.60771: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/AnsiballZ_stat.py 40074 1727204621.60888: Sending initial data 40074 1727204621.60895: Sent initial data (153 bytes) 40074 1727204621.61363: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204621.61367: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204621.61371: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.61374: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204621.61376: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.61429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204621.61434: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204621.61479: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.63162: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 40074 1727204621.63167: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204621.63199: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204621.63238: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpt0msl8jb /root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/AnsiballZ_stat.py <<< 40074 1727204621.63244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/AnsiballZ_stat.py" <<< 40074 1727204621.63269: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpt0msl8jb" to remote "/root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/AnsiballZ_stat.py" <<< 40074 1727204621.64049: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204621.64116: stderr chunk (state=3): >>><<< 40074 1727204621.64119: stdout chunk (state=3): >>><<< 40074 1727204621.64140: done transferring module to remote 40074 1727204621.64150: _low_level_execute_command(): starting 40074 1727204621.64155: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/ /root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/AnsiballZ_stat.py && sleep 0' 40074 1727204621.64591: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204621.64633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204621.64636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204621.64639: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.64642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204621.64644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.64695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204621.64699: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204621.64746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.66656: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204621.66706: stderr chunk (state=3): >>><<< 40074 1727204621.66709: stdout chunk (state=3): >>><<< 40074 1727204621.66727: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204621.66731: _low_level_execute_command(): starting 40074 1727204621.66735: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/AnsiballZ_stat.py && sleep 0' 40074 1727204621.67168: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204621.67206: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204621.67211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204621.67214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 40074 1727204621.67216: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204621.67218: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.67269: stderr chunk (state=3): 
>>>debug1: auto-mux: Trying existing master <<< 40074 1727204621.67273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204621.67324: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.84895: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 39995, "dev": 23, "nlink": 1, "atime": 1727204619.9655643, "mtime": 1727204619.9655643, "ctime": 1727204619.9655643, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 40074 1727204621.86413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204621.86495: stderr chunk (state=3): >>><<< 40074 1727204621.86498: stdout chunk (state=3): >>><<< 40074 1727204621.86697: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/ethtest1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 39995, "dev": 23, "nlink": 1, "atime": 1727204619.9655643, "mtime": 1727204619.9655643, "ctime": 1727204619.9655643, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204621.86702: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204621.86706: _low_level_execute_command(): starting 40074 1727204621.86708: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204621.5770907-40819-195470143076190/ > /dev/null 2>&1 && sleep 0' 40074 1727204621.87389: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204621.87473: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204621.87529: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204621.87533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204621.87615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204621.89676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204621.89680: stdout chunk (state=3): >>><<< 40074 1727204621.89697: stderr chunk (state=3): >>><<< 40074 1727204621.89895: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204621.89898: handler run complete 40074 1727204621.89901: attempt loop complete, returning result 40074 1727204621.89904: _execute() done 40074 1727204621.89906: dumping result to json 40074 1727204621.89908: done dumping result, returning 40074 1727204621.89911: done running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest1 [12b410aa-8751-9fd7-2501-000000000483] 40074 1727204621.89913: sending task result for task 12b410aa-8751-9fd7-2501-000000000483 ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204619.9655643, "block_size": 4096, "blocks": 0, "ctime": 1727204619.9655643, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 39995, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/ethtest1", "lnk_target": "../../devices/virtual/net/ethtest1", "mode": "0777", "mtime": 1727204619.9655643, "nlink": 1, "path": "/sys/class/net/ethtest1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 40074 1727204621.90132: no more pending results, returning what we have 40074 1727204621.90136: results queue empty 40074 1727204621.90137: checking for any_errors_fatal 40074 1727204621.90139: done checking for any_errors_fatal 40074 1727204621.90140: checking for max_fail_percentage 40074 1727204621.90142: done checking for max_fail_percentage 40074 1727204621.90143: checking to see if all hosts have failed and the running result is not ok 40074 1727204621.90145: done checking to see if all hosts 
have failed 40074 1727204621.90146: getting the remaining hosts for this loop 40074 1727204621.90147: done getting the remaining hosts for this loop 40074 1727204621.90152: getting the next task for host managed-node2 40074 1727204621.90162: done getting next task for host managed-node2 40074 1727204621.90165: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 40074 1727204621.90168: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204621.90173: getting variables 40074 1727204621.90175: in VariableManager get_vars() 40074 1727204621.90503: Calling all_inventory to load vars for managed-node2 40074 1727204621.90508: Calling groups_inventory to load vars for managed-node2 40074 1727204621.90511: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.90522: done sending task result for task 12b410aa-8751-9fd7-2501-000000000483 40074 1727204621.90525: WORKER PROCESS EXITING 40074 1727204621.90543: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.90547: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.90551: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.91004: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.91397: done with get_vars() 40074 1727204621.91417: done getting variables 40074 1727204621.91487: Loading ActionModule 'assert' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204621.91648: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'ethtest1'] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.387) 0:00:15.678 ***** 40074 1727204621.91681: entering _queue_task() for managed-node2/assert 40074 1727204621.92099: worker is 1 (out of 1 available) 40074 1727204621.92114: exiting _queue_task() for managed-node2/assert 40074 1727204621.92129: done queuing things up, now waiting for results queue to drain 40074 1727204621.92131: waiting for pending results... 40074 1727204621.92355: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'ethtest1' 40074 1727204621.92535: in run() - task 12b410aa-8751-9fd7-2501-0000000003ec 40074 1727204621.92539: variable 'ansible_search_path' from source: unknown 40074 1727204621.92542: variable 'ansible_search_path' from source: unknown 40074 1727204621.92569: calling self._execute() 40074 1727204621.92685: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.92702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.92752: variable 'omit' from source: magic vars 40074 1727204621.93195: variable 'ansible_distribution_major_version' from source: facts 40074 1727204621.93215: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204621.93229: variable 'omit' from source: magic vars 40074 1727204621.93293: variable 'omit' from source: magic vars 40074 1727204621.93485: variable 'interface' from source: 
set_fact 40074 1727204621.93488: variable 'omit' from source: magic vars 40074 1727204621.93506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204621.93559: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204621.93591: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204621.93630: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204621.93649: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204621.93704: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204621.93721: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.93735: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.93895: Set connection var ansible_pipelining to False 40074 1727204621.93899: Set connection var ansible_shell_executable to /bin/sh 40074 1727204621.93904: Set connection var ansible_shell_type to sh 40074 1727204621.93928: Set connection var ansible_connection to ssh 40074 1727204621.93939: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204621.93994: Set connection var ansible_timeout to 10 40074 1727204621.93997: variable 'ansible_shell_executable' from source: unknown 40074 1727204621.94000: variable 'ansible_connection' from source: unknown 40074 1727204621.94006: variable 'ansible_module_compression' from source: unknown 40074 1727204621.94014: variable 'ansible_shell_type' from source: unknown 40074 1727204621.94029: variable 'ansible_shell_executable' from source: unknown 40074 1727204621.94041: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.94051: 
variable 'ansible_pipelining' from source: unknown 40074 1727204621.94063: variable 'ansible_timeout' from source: unknown 40074 1727204621.94095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.94264: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204621.94281: variable 'omit' from source: magic vars 40074 1727204621.94494: starting attempt loop 40074 1727204621.94497: running the handler 40074 1727204621.94500: variable 'interface_stat' from source: set_fact 40074 1727204621.94523: Evaluated conditional (interface_stat.stat.exists): True 40074 1727204621.94527: handler run complete 40074 1727204621.94546: attempt loop complete, returning result 40074 1727204621.94549: _execute() done 40074 1727204621.94552: dumping result to json 40074 1727204621.94558: done dumping result, returning 40074 1727204621.94566: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'ethtest1' [12b410aa-8751-9fd7-2501-0000000003ec] 40074 1727204621.94578: sending task result for task 12b410aa-8751-9fd7-2501-0000000003ec 40074 1727204621.94685: done sending task result for task 12b410aa-8751-9fd7-2501-0000000003ec 40074 1727204621.94687: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 40074 1727204621.94761: no more pending results, returning what we have 40074 1727204621.94765: results queue empty 40074 1727204621.94766: checking for any_errors_fatal 40074 1727204621.94776: done checking for any_errors_fatal 40074 1727204621.94777: checking for max_fail_percentage 40074 1727204621.94779: done checking for max_fail_percentage 40074 1727204621.94780: checking to see if all hosts have failed and the 
running result is not ok 40074 1727204621.94782: done checking to see if all hosts have failed 40074 1727204621.94821: getting the remaining hosts for this loop 40074 1727204621.94823: done getting the remaining hosts for this loop 40074 1727204621.94829: getting the next task for host managed-node2 40074 1727204621.94841: done getting next task for host managed-node2 40074 1727204621.94848: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 40074 1727204621.94851: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204621.94871: getting variables 40074 1727204621.94873: in VariableManager get_vars() 40074 1727204621.95068: Calling all_inventory to load vars for managed-node2 40074 1727204621.95071: Calling groups_inventory to load vars for managed-node2 40074 1727204621.95074: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.95084: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.95088: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.95096: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.95515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.95917: done with get_vars() 40074 1727204621.95937: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:41 -0400 (0:00:00.044) 0:00:15.723 ***** 40074 1727204621.96173: entering _queue_task() for managed-node2/include_tasks 40074 1727204621.96470: worker is 1 (out of 1 available) 40074 1727204621.96498: exiting _queue_task() for managed-node2/include_tasks 40074 1727204621.96512: done queuing things up, now waiting for results queue to drain 40074 1727204621.96514: waiting for pending results... 
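[editor's note] The assertion that just passed ("All assertions passed" for `Assert that the interface is present - 'ethtest1'`, gated on `interface_stat.stat.exists`) follows the common stat-then-assert pattern. A minimal sketch — task names mirror the log, but the file contents and the sysfs path are assumed, not quoted from `assert_device_present.yml`:

```yaml
# Hypothetical reconstruction of the stat-then-assert pair traced above.
# The /sys/class/net path is an assumption; the trace only shows that
# `interface_stat.stat.exists` evaluated to True.
- name: Get stat for interface - '{{ interface }}'
  stat:
    path: "/sys/class/net/{{ interface }}"
  register: interface_stat

- name: Assert that the interface is present - '{{ interface }}'
  assert:
    that: interface_stat.stat.exists
```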
40074 1727204621.96685: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 40074 1727204621.96788: in run() - task 12b410aa-8751-9fd7-2501-00000000001b 40074 1727204621.96803: variable 'ansible_search_path' from source: unknown 40074 1727204621.96807: variable 'ansible_search_path' from source: unknown 40074 1727204621.96839: calling self._execute() 40074 1727204621.96912: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204621.96923: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204621.96930: variable 'omit' from source: magic vars 40074 1727204621.97233: variable 'ansible_distribution_major_version' from source: facts 40074 1727204621.97244: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204621.97251: _execute() done 40074 1727204621.97256: dumping result to json 40074 1727204621.97259: done dumping result, returning 40074 1727204621.97267: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-9fd7-2501-00000000001b] 40074 1727204621.97272: sending task result for task 12b410aa-8751-9fd7-2501-00000000001b 40074 1727204621.97369: done sending task result for task 12b410aa-8751-9fd7-2501-00000000001b 40074 1727204621.97372: WORKER PROCESS EXITING 40074 1727204621.97433: no more pending results, returning what we have 40074 1727204621.97437: in VariableManager get_vars() 40074 1727204621.97479: Calling all_inventory to load vars for managed-node2 40074 1727204621.97482: Calling groups_inventory to load vars for managed-node2 40074 1727204621.97485: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.97497: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.97500: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.97504: Calling 
groups_plugins_play to load vars for managed-node2 40074 1727204621.97673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204621.97874: done with get_vars() 40074 1727204621.97881: variable 'ansible_search_path' from source: unknown 40074 1727204621.97882: variable 'ansible_search_path' from source: unknown 40074 1727204621.97914: we have included files to process 40074 1727204621.97915: generating all_blocks data 40074 1727204621.97916: done generating all_blocks data 40074 1727204621.97922: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 40074 1727204621.97922: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 40074 1727204621.97924: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 40074 1727204621.98757: done processing included file 40074 1727204621.98760: iterating over new_blocks loaded from include file 40074 1727204621.98761: in VariableManager get_vars() 40074 1727204621.98795: done with get_vars() 40074 1727204621.98797: filtering new block on tags 40074 1727204621.98820: done filtering new block on tags 40074 1727204621.98824: in VariableManager get_vars() 40074 1727204621.98854: done with get_vars() 40074 1727204621.98856: filtering new block on tags 40074 1727204621.98884: done filtering new block on tags 40074 1727204621.98887: in VariableManager get_vars() 40074 1727204621.98920: done with get_vars() 40074 1727204621.98922: filtering new block on tags 40074 1727204621.98947: done filtering new block on tags 40074 1727204621.98949: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 40074 1727204621.98956: extending task lists for 
all hosts with included blocks 40074 1727204621.99770: done extending task lists 40074 1727204621.99772: done processing included files 40074 1727204621.99772: results queue empty 40074 1727204621.99773: checking for any_errors_fatal 40074 1727204621.99775: done checking for any_errors_fatal 40074 1727204621.99776: checking for max_fail_percentage 40074 1727204621.99777: done checking for max_fail_percentage 40074 1727204621.99777: checking to see if all hosts have failed and the running result is not ok 40074 1727204621.99778: done checking to see if all hosts have failed 40074 1727204621.99779: getting the remaining hosts for this loop 40074 1727204621.99780: done getting the remaining hosts for this loop 40074 1727204621.99781: getting the next task for host managed-node2 40074 1727204621.99785: done getting next task for host managed-node2 40074 1727204621.99786: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 40074 1727204621.99791: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204621.99800: getting variables 40074 1727204621.99801: in VariableManager get_vars() 40074 1727204621.99814: Calling all_inventory to load vars for managed-node2 40074 1727204621.99815: Calling groups_inventory to load vars for managed-node2 40074 1727204621.99819: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204621.99823: Calling all_plugins_play to load vars for managed-node2 40074 1727204621.99825: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204621.99828: Calling groups_plugins_play to load vars for managed-node2 40074 1727204621.99977: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204622.00179: done with get_vars() 40074 1727204622.00187: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.040) 0:00:15.764 ***** 40074 1727204622.00249: entering _queue_task() for managed-node2/setup 40074 1727204622.00470: worker is 1 (out of 1 available) 40074 1727204622.00486: exiting _queue_task() for managed-node2/setup 40074 1727204622.00500: done queuing things up, now waiting for results queue to drain 40074 1727204622.00502: waiting for pending results... 
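[editor's note] The task queued here is gated on the condition the trace evaluates a few records later: `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`. Jinja's `difference` filter yields the required facts not yet present in `ansible_facts`; when that list is empty, `length > 0` is False and the task skips with its output censored (the trace confirms `no_log: true`). A sketch of the gated task — the `when:` condition and `no_log` are taken from the log, while `gather_subset` and the exact module arguments are assumptions:

```yaml
# Hypothetical reconstruction of the gated fact-gathering task.
# With all required facts already gathered, `difference` returns an
# empty list, the `when:` is False, and the task skips, as traced.
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min  # assumed; not visible in the trace
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true
```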
40074 1727204622.00674: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 40074 1727204622.00782: in run() - task 12b410aa-8751-9fd7-2501-00000000049b 40074 1727204622.00797: variable 'ansible_search_path' from source: unknown 40074 1727204622.00801: variable 'ansible_search_path' from source: unknown 40074 1727204622.00834: calling self._execute() 40074 1727204622.00905: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204622.00912: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204622.00954: variable 'omit' from source: magic vars 40074 1727204622.01296: variable 'ansible_distribution_major_version' from source: facts 40074 1727204622.01300: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204622.01599: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204622.03654: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204622.03982: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204622.04016: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204622.04048: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204622.04072: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204622.04143: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204622.04170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204622.04193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204622.04228: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204622.04241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204622.04291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204622.04312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204622.04335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204622.04367: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204622.04385: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204622.04516: variable '__network_required_facts' from source: role 
'' defaults 40074 1727204622.04551: variable 'ansible_facts' from source: unknown 40074 1727204622.04923: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 40074 1727204622.04927: when evaluation is False, skipping this task 40074 1727204622.04929: _execute() done 40074 1727204622.04931: dumping result to json 40074 1727204622.04933: done dumping result, returning 40074 1727204622.04936: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-9fd7-2501-00000000049b] 40074 1727204622.04938: sending task result for task 12b410aa-8751-9fd7-2501-00000000049b 40074 1727204622.05009: done sending task result for task 12b410aa-8751-9fd7-2501-00000000049b 40074 1727204622.05012: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204622.05068: no more pending results, returning what we have 40074 1727204622.05071: results queue empty 40074 1727204622.05072: checking for any_errors_fatal 40074 1727204622.05074: done checking for any_errors_fatal 40074 1727204622.05075: checking for max_fail_percentage 40074 1727204622.05077: done checking for max_fail_percentage 40074 1727204622.05077: checking to see if all hosts have failed and the running result is not ok 40074 1727204622.05079: done checking to see if all hosts have failed 40074 1727204622.05080: getting the remaining hosts for this loop 40074 1727204622.05081: done getting the remaining hosts for this loop 40074 1727204622.05085: getting the next task for host managed-node2 40074 1727204622.05096: done getting next task for host managed-node2 40074 1727204622.05100: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 40074 1727204622.05105: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204622.05120: getting variables 40074 1727204622.05129: in VariableManager get_vars() 40074 1727204622.05172: Calling all_inventory to load vars for managed-node2 40074 1727204622.05175: Calling groups_inventory to load vars for managed-node2 40074 1727204622.05178: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204622.05191: Calling all_plugins_play to load vars for managed-node2 40074 1727204622.05195: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204622.05198: Calling groups_plugins_play to load vars for managed-node2 40074 1727204622.05493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204622.05884: done with get_vars() 40074 1727204622.05908: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.057) 0:00:15.822 ***** 40074 1727204622.06035: entering _queue_task() for managed-node2/stat 40074 1727204622.06352: worker is 1 (out of 1 
available) 40074 1727204622.06369: exiting _queue_task() for managed-node2/stat 40074 1727204622.06382: done queuing things up, now waiting for results queue to drain 40074 1727204622.06384: waiting for pending results... 40074 1727204622.06707: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 40074 1727204622.06780: in run() - task 12b410aa-8751-9fd7-2501-00000000049d 40074 1727204622.06807: variable 'ansible_search_path' from source: unknown 40074 1727204622.06821: variable 'ansible_search_path' from source: unknown 40074 1727204622.06867: calling self._execute() 40074 1727204622.06969: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204622.06986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204622.07009: variable 'omit' from source: magic vars 40074 1727204622.07594: variable 'ansible_distribution_major_version' from source: facts 40074 1727204622.07597: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204622.07768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204622.08001: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204622.08043: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204622.08073: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204622.08106: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204622.08181: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204622.08208: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204622.08233: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204622.08256: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204622.08334: variable '__network_is_ostree' from source: set_fact 40074 1727204622.08341: Evaluated conditional (not __network_is_ostree is defined): False 40074 1727204622.08346: when evaluation is False, skipping this task 40074 1727204622.08349: _execute() done 40074 1727204622.08355: dumping result to json 40074 1727204622.08357: done dumping result, returning 40074 1727204622.08366: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-9fd7-2501-00000000049d] 40074 1727204622.08371: sending task result for task 12b410aa-8751-9fd7-2501-00000000049d 40074 1727204622.08468: done sending task result for task 12b410aa-8751-9fd7-2501-00000000049d 40074 1727204622.08471: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 40074 1727204622.08533: no more pending results, returning what we have 40074 1727204622.08537: results queue empty 40074 1727204622.08538: checking for any_errors_fatal 40074 1727204622.08546: done checking for any_errors_fatal 40074 1727204622.08547: checking for max_fail_percentage 40074 1727204622.08548: done checking for max_fail_percentage 40074 1727204622.08549: checking to see if all hosts have failed and the running result is not ok 40074 
1727204622.08551: done checking to see if all hosts have failed 40074 1727204622.08551: getting the remaining hosts for this loop 40074 1727204622.08553: done getting the remaining hosts for this loop 40074 1727204622.08557: getting the next task for host managed-node2 40074 1727204622.08563: done getting next task for host managed-node2 40074 1727204622.08568: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 40074 1727204622.08572: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204622.08597: getting variables 40074 1727204622.08599: in VariableManager get_vars() 40074 1727204622.08639: Calling all_inventory to load vars for managed-node2 40074 1727204622.08642: Calling groups_inventory to load vars for managed-node2 40074 1727204622.08644: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204622.08654: Calling all_plugins_play to load vars for managed-node2 40074 1727204622.08658: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204622.08661: Calling groups_plugins_play to load vars for managed-node2 40074 1727204622.08873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204622.09078: done with get_vars() 40074 1727204622.09087: done getting variables 40074 1727204622.09139: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.031) 0:00:15.853 ***** 40074 1727204622.09168: entering _queue_task() for managed-node2/set_fact 40074 1727204622.09385: worker is 1 (out of 1 available) 40074 1727204622.09402: exiting _queue_task() for managed-node2/set_fact 40074 1727204622.09416: done queuing things up, now waiting for results queue to drain 40074 1727204622.09417: waiting for pending results... 
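[editor's note] Both ostree tasks above skip because `__network_is_ostree` was already set by an earlier `set_fact`, so their shared guard `not __network_is_ostree is defined` is False. A sketch of the guard pattern — the marker-file path is the conventional one and is assumed here rather than quoted from the role:

```yaml
# Hypothetical reconstruction of the ostree guard traced above.
# Once __network_is_ostree exists, both `when:` clauses are False
# and the tasks skip on every subsequent role invocation.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted  # assumed marker file
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```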
40074 1727204622.09707: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 40074 1727204622.09870: in run() - task 12b410aa-8751-9fd7-2501-00000000049e 40074 1727204622.09875: variable 'ansible_search_path' from source: unknown 40074 1727204622.09877: variable 'ansible_search_path' from source: unknown 40074 1727204622.09881: calling self._execute() 40074 1727204622.10097: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204622.10101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204622.10104: variable 'omit' from source: magic vars 40074 1727204622.10470: variable 'ansible_distribution_major_version' from source: facts 40074 1727204622.10493: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204622.10744: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204622.10998: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204622.11039: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204622.11070: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204622.11132: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204622.11210: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204622.11234: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204622.11256: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204622.11277: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204622.11357: variable '__network_is_ostree' from source: set_fact 40074 1727204622.11365: Evaluated conditional (not __network_is_ostree is defined): False 40074 1727204622.11368: when evaluation is False, skipping this task 40074 1727204622.11372: _execute() done 40074 1727204622.11377: dumping result to json 40074 1727204622.11381: done dumping result, returning 40074 1727204622.11391: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-9fd7-2501-00000000049e] 40074 1727204622.11397: sending task result for task 12b410aa-8751-9fd7-2501-00000000049e 40074 1727204622.11493: done sending task result for task 12b410aa-8751-9fd7-2501-00000000049e 40074 1727204622.11496: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 40074 1727204622.11549: no more pending results, returning what we have 40074 1727204622.11553: results queue empty 40074 1727204622.11554: checking for any_errors_fatal 40074 1727204622.11561: done checking for any_errors_fatal 40074 1727204622.11562: checking for max_fail_percentage 40074 1727204622.11563: done checking for max_fail_percentage 40074 1727204622.11564: checking to see if all hosts have failed and the running result is not ok 40074 1727204622.11565: done checking to see if all hosts have failed 40074 1727204622.11566: getting the remaining hosts for this loop 40074 1727204622.11568: done getting the remaining hosts for this loop 
40074 1727204622.11572: getting the next task for host managed-node2 40074 1727204622.11582: done getting next task for host managed-node2 40074 1727204622.11587: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 40074 1727204622.11593: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204622.11610: getting variables 40074 1727204622.11612: in VariableManager get_vars() 40074 1727204622.11653: Calling all_inventory to load vars for managed-node2 40074 1727204622.11656: Calling groups_inventory to load vars for managed-node2 40074 1727204622.11659: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204622.11669: Calling all_plugins_play to load vars for managed-node2 40074 1727204622.11672: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204622.11675: Calling groups_plugins_play to load vars for managed-node2 40074 1727204622.11865: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204622.12100: done with get_vars() 40074 1727204622.12109: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:42 -0400 (0:00:00.030) 0:00:15.883 ***** 40074 1727204622.12188: entering _queue_task() for managed-node2/service_facts 40074 1727204622.12192: Creating lock for service_facts 40074 1727204622.12425: worker is 1 (out of 1 available) 40074 1727204622.12439: exiting _queue_task() for managed-node2/service_facts 40074 1727204622.12451: done queuing things up, now waiting for results queue to drain 40074 1727204622.12453: waiting for pending results... 
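[editor's note] The `Creating lock for service_facts` line marks the first use of that module in this run. `service_facts` takes no arguments and populates `ansible_facts.services`; a minimal invocation plus an illustrative consumer — the service name and `debug` task are assumptions, not part of the traced role:

```yaml
# Hypothetical sketch: gather running services, then read one entry.
- name: Check which services are running
  service_facts:

- name: Use the result (illustrative only)
  debug:
    msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
  when: "'NetworkManager.service' in ansible_facts.services"
```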
40074 1727204622.12642: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 40074 1727204622.12754: in run() - task 12b410aa-8751-9fd7-2501-0000000004a0 40074 1727204622.12767: variable 'ansible_search_path' from source: unknown 40074 1727204622.12771: variable 'ansible_search_path' from source: unknown 40074 1727204622.12806: calling self._execute() 40074 1727204622.12881: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204622.12887: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204622.12904: variable 'omit' from source: magic vars 40074 1727204622.13211: variable 'ansible_distribution_major_version' from source: facts 40074 1727204622.13227: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204622.13232: variable 'omit' from source: magic vars 40074 1727204622.13299: variable 'omit' from source: magic vars 40074 1727204622.13336: variable 'omit' from source: magic vars 40074 1727204622.13369: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204622.13402: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204622.13420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204622.13445: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204622.13456: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204622.13482: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204622.13485: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204622.13492: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 40074 1727204622.13583: Set connection var ansible_pipelining to False 40074 1727204622.13592: Set connection var ansible_shell_executable to /bin/sh 40074 1727204622.13595: Set connection var ansible_shell_type to sh 40074 1727204622.13598: Set connection var ansible_connection to ssh 40074 1727204622.13605: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204622.13611: Set connection var ansible_timeout to 10 40074 1727204622.13637: variable 'ansible_shell_executable' from source: unknown 40074 1727204622.13640: variable 'ansible_connection' from source: unknown 40074 1727204622.13643: variable 'ansible_module_compression' from source: unknown 40074 1727204622.13646: variable 'ansible_shell_type' from source: unknown 40074 1727204622.13649: variable 'ansible_shell_executable' from source: unknown 40074 1727204622.13659: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204622.13663: variable 'ansible_pipelining' from source: unknown 40074 1727204622.13665: variable 'ansible_timeout' from source: unknown 40074 1727204622.13669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204622.13842: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204622.13852: variable 'omit' from source: magic vars 40074 1727204622.13858: starting attempt loop 40074 1727204622.13861: running the handler 40074 1727204622.13880: _low_level_execute_command(): starting 40074 1727204622.13884: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204622.14438: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 40074 1727204622.14443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 40074 1727204622.14447: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204622.14449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.14495: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204622.14512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204622.14515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204622.14554: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204622.16346: stdout chunk (state=3): >>>/root <<< 40074 1727204622.16455: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204622.16515: stderr chunk (state=3): >>><<< 40074 1727204622.16521: stdout chunk (state=3): >>><<< 40074 1727204622.16541: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204622.16554: _low_level_execute_command(): starting 40074 1727204622.16561: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580 `" && echo ansible-tmp-1727204622.1654158-40841-267692779556580="` echo /root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580 `" ) && sleep 0' 40074 1727204622.17044: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204622.17049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204622.17051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.17061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204622.17063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.17110: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204622.17113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204622.17161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204622.19193: stdout chunk (state=3): >>>ansible-tmp-1727204622.1654158-40841-267692779556580=/root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580 <<< 40074 1727204622.19313: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204622.19366: stderr chunk (state=3): >>><<< 40074 1727204622.19370: stdout chunk (state=3): >>><<< 40074 1727204622.19384: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204622.1654158-40841-267692779556580=/root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204622.19432: variable 'ansible_module_compression' from source: unknown 40074 1727204622.19474: ANSIBALLZ: Using lock for service_facts 40074 1727204622.19478: ANSIBALLZ: Acquiring lock 40074 1727204622.19480: ANSIBALLZ: Lock acquired: 139809959246880 40074 1727204622.19483: ANSIBALLZ: Creating module 40074 1727204622.31255: ANSIBALLZ: Writing module into payload 40074 1727204622.31343: ANSIBALLZ: Writing module 40074 1727204622.31362: ANSIBALLZ: Renaming module 40074 1727204622.31368: ANSIBALLZ: Done creating module 40074 1727204622.31384: variable 'ansible_facts' from source: unknown 40074 1727204622.31438: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/AnsiballZ_service_facts.py 40074 1727204622.31554: Sending initial data 40074 1727204622.31558: Sent initial data (162 bytes) 40074 1727204622.32053: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204622.32057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.32060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204622.32062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.32126: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204622.32137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204622.32139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204622.32177: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204622.33908: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 40074 1727204622.33913: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204622.33943: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204622.33988: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpgkq9phzv /root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/AnsiballZ_service_facts.py <<< 40074 1727204622.33992: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/AnsiballZ_service_facts.py" <<< 40074 1727204622.34023: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpgkq9phzv" to remote "/root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/AnsiballZ_service_facts.py" <<< 40074 1727204622.34031: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/AnsiballZ_service_facts.py" <<< 40074 1727204622.34821: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204622.34902: stderr chunk (state=3): >>><<< 40074 1727204622.34906: stdout chunk (state=3): >>><<< 40074 1727204622.34927: done transferring module to remote 40074 1727204622.34938: _low_level_execute_command(): starting 40074 1727204622.34944: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/ /root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/AnsiballZ_service_facts.py && sleep 0' 40074 1727204622.35433: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204622.35436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.35439: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204622.35445: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.35496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204622.35499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204622.35544: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204622.37460: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204622.37515: stderr chunk (state=3): >>><<< 40074 1727204622.37521: stdout chunk (state=3): >>><<< 40074 1727204622.37534: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204622.37537: _low_level_execute_command(): starting 40074 1727204622.37543: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/AnsiballZ_service_facts.py && sleep 0' 40074 1727204622.38027: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204622.38030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.38033: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204622.38035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204622.38086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204622.38094: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204622.38139: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204624.40601: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": 
"dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": 
"ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 40074 1727204624.40659: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", 
"state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", 
"source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, <<< 40074 1727204624.40675: stdout chunk (state=3): >>>"systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": 
"chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": 
"dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": <<< 40074 1727204624.40707: stdout chunk (state=3): >>>"inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": 
{"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 40074 1727204624.42391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204624.42497: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
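The module result that just completed above is, by its shape, the `ansible.builtin.service_facts` payload: `ansible_facts.services` maps each unit name to its `state`, `status`, and `source`. A minimal sketch of post-processing such a payload once it has been captured as JSON; the three-service dict below is a hypothetical excerpt for illustration, not the full list from this log:

```python
import json

# Hypothetical excerpt of the ansible_facts.services mapping seen in the log;
# the real payload enumerates every unit on the managed host.
payload = json.loads("""
{"ansible_facts": {"services": {
  "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"},
  "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"},
  "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}
}}}
""")

services = payload["ansible_facts"]["services"]

# Keep only the units systemd reports as currently running.
running = sorted(name for name, svc in services.items() if svc["state"] == "running")
print(running)  # ['sshd.service', 'systemd-journald.service']
```

In a playbook the same filter is typically done with a Jinja2 expression over `ansible_facts.services` after a `service_facts` task; the Python form above is just the equivalent offline check against a captured result.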
<<< 40074 1727204624.42517: stderr chunk (state=3): >>><<< 40074 1727204624.42527: stdout chunk (state=3): >>><<< 40074 1727204624.42557: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": 
{"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", 
"status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": 
"mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204624.43608: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204624.43683: _low_level_execute_command(): starting 40074 1727204624.43686: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204622.1654158-40841-267692779556580/ > /dev/null 2>&1 && sleep 0' 40074 1727204624.44270: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204624.44284: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204624.44346: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204624.44410: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204624.44431: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204624.44469: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204624.44534: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204624.46599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204624.46603: stdout chunk (state=3): >>><<< 40074 1727204624.46606: stderr chunk (state=3): >>><<< 40074 1727204624.46608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204624.46611: handler run complete 40074 1727204624.47000: variable 'ansible_facts' from source: 
unknown 40074 1727204624.47667: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204624.49454: variable 'ansible_facts' from source: unknown 40074 1727204624.49746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204624.50387: attempt loop complete, returning result 40074 1727204624.50413: _execute() done 40074 1727204624.50421: dumping result to json 40074 1727204624.50713: done dumping result, returning 40074 1727204624.50735: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-9fd7-2501-0000000004a0] 40074 1727204624.50745: sending task result for task 12b410aa-8751-9fd7-2501-0000000004a0 40074 1727204624.52627: done sending task result for task 12b410aa-8751-9fd7-2501-0000000004a0 40074 1727204624.52630: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204624.52734: no more pending results, returning what we have 40074 1727204624.52737: results queue empty 40074 1727204624.52738: checking for any_errors_fatal 40074 1727204624.52742: done checking for any_errors_fatal 40074 1727204624.52743: checking for max_fail_percentage 40074 1727204624.52745: done checking for max_fail_percentage 40074 1727204624.52746: checking to see if all hosts have failed and the running result is not ok 40074 1727204624.52747: done checking to see if all hosts have failed 40074 1727204624.52748: getting the remaining hosts for this loop 40074 1727204624.52750: done getting the remaining hosts for this loop 40074 1727204624.52754: getting the next task for host managed-node2 40074 1727204624.52759: done getting next task for host managed-node2 40074 1727204624.52763: ^ task is: TASK: fedora.linux_system_roles.network : Check which 
packages are installed 40074 1727204624.52773: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204624.52785: getting variables 40074 1727204624.52787: in VariableManager get_vars() 40074 1727204624.52826: Calling all_inventory to load vars for managed-node2 40074 1727204624.52830: Calling groups_inventory to load vars for managed-node2 40074 1727204624.52833: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204624.52843: Calling all_plugins_play to load vars for managed-node2 40074 1727204624.52847: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204624.52851: Calling groups_plugins_play to load vars for managed-node2 40074 1727204624.54361: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204624.56096: done with get_vars() 40074 1727204624.56231: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:44 -0400 (0:00:02.441) 0:00:18.325 ***** 40074 
1727204624.56365: entering _queue_task() for managed-node2/package_facts 40074 1727204624.56367: Creating lock for package_facts 40074 1727204624.56717: worker is 1 (out of 1 available) 40074 1727204624.56729: exiting _queue_task() for managed-node2/package_facts 40074 1727204624.56741: done queuing things up, now waiting for results queue to drain 40074 1727204624.56743: waiting for pending results... 40074 1727204624.57104: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 40074 1727204624.57237: in run() - task 12b410aa-8751-9fd7-2501-0000000004a1 40074 1727204624.57257: variable 'ansible_search_path' from source: unknown 40074 1727204624.57266: variable 'ansible_search_path' from source: unknown 40074 1727204624.57316: calling self._execute() 40074 1727204624.57422: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204624.57436: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204624.57452: variable 'omit' from source: magic vars 40074 1727204624.57901: variable 'ansible_distribution_major_version' from source: facts 40074 1727204624.57962: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204624.57966: variable 'omit' from source: magic vars 40074 1727204624.58032: variable 'omit' from source: magic vars 40074 1727204624.58084: variable 'omit' from source: magic vars 40074 1727204624.58133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204624.58185: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204624.58214: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204624.58241: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 
1727204624.58289: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204624.58307: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204624.58316: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204624.58324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204624.58462: Set connection var ansible_pipelining to False 40074 1727204624.58509: Set connection var ansible_shell_executable to /bin/sh 40074 1727204624.58512: Set connection var ansible_shell_type to sh 40074 1727204624.58514: Set connection var ansible_connection to ssh 40074 1727204624.58517: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204624.58519: Set connection var ansible_timeout to 10 40074 1727204624.58547: variable 'ansible_shell_executable' from source: unknown 40074 1727204624.58555: variable 'ansible_connection' from source: unknown 40074 1727204624.58562: variable 'ansible_module_compression' from source: unknown 40074 1727204624.58593: variable 'ansible_shell_type' from source: unknown 40074 1727204624.58596: variable 'ansible_shell_executable' from source: unknown 40074 1727204624.58598: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204624.58601: variable 'ansible_pipelining' from source: unknown 40074 1727204624.58603: variable 'ansible_timeout' from source: unknown 40074 1727204624.58605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204624.58849: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204624.58945: variable 'omit' from source: magic vars 40074 1727204624.58948: 
starting attempt loop 40074 1727204624.58951: running the handler 40074 1727204624.58953: _low_level_execute_command(): starting 40074 1727204624.58956: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204624.59677: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204624.59699: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204624.59725: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204624.59837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204624.59865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204624.59885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204624.59913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204624.60004: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204624.61861: stdout chunk (state=3): >>>/root <<< 40074 1727204624.62062: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204624.62066: stdout chunk (state=3): >>><<< 40074 1727204624.62068: stderr 
chunk (state=3): >>><<< 40074 1727204624.62094: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204624.62113: _low_level_execute_command(): starting 40074 1727204624.62125: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918 `" && echo ansible-tmp-1727204624.6210017-40895-5966244791918="` echo /root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918 `" ) && sleep 0' 40074 1727204624.62755: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204624.62770: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204624.62784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 
1727204624.62806: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204624.62824: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204624.62857: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204624.62966: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204624.62986: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204624.63068: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204624.65176: stdout chunk (state=3): >>>ansible-tmp-1727204624.6210017-40895-5966244791918=/root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918 <<< 40074 1727204624.65364: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204624.65368: stdout chunk (state=3): >>><<< 40074 1727204624.65376: stderr chunk (state=3): >>><<< 40074 1727204624.65501: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204624.6210017-40895-5966244791918=/root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204624.65505: variable 'ansible_module_compression' from source: unknown 40074 1727204624.65520: ANSIBALLZ: Using lock for package_facts 40074 1727204624.65527: ANSIBALLZ: Acquiring lock 40074 1727204624.65530: ANSIBALLZ: Lock acquired: 139809969603840 40074 1727204624.65537: ANSIBALLZ: Creating module 40074 1727204625.10503: ANSIBALLZ: Writing module into payload 40074 1727204625.10627: ANSIBALLZ: Writing module 40074 1727204625.10655: ANSIBALLZ: Renaming module 40074 1727204625.10664: ANSIBALLZ: Done creating module 40074 1727204625.10682: variable 'ansible_facts' from source: unknown 40074 1727204625.10810: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/AnsiballZ_package_facts.py 40074 1727204625.10941: Sending initial data 40074 1727204625.10944: Sent initial data (160 bytes) 40074 1727204625.11407: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config <<< 40074 1727204625.11411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204625.11508: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204625.11544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204625.11620: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204625.13407: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204625.13443: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204625.13477: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpd9pm3h7e /root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/AnsiballZ_package_facts.py <<< 40074 1727204625.13486: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/AnsiballZ_package_facts.py" <<< 40074 1727204625.13513: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpd9pm3h7e" to remote "/root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/AnsiballZ_package_facts.py" <<< 40074 1727204625.13522: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/AnsiballZ_package_facts.py" <<< 40074 1727204625.15496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204625.15500: stderr chunk (state=3): >>><<< 40074 1727204625.15502: stdout chunk (state=3): >>><<< 40074 1727204625.15520: done transferring module to remote 40074 1727204625.15537: _low_level_execute_command(): starting 40074 1727204625.15543: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/ /root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/AnsiballZ_package_facts.py && sleep 0' 40074 1727204625.16167: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204625.16171: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204625.16196: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204625.16200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204625.16216: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204625.16278: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204625.16284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204625.16286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204625.16325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204625.18277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204625.18325: stderr chunk (state=3): >>><<< 40074 1727204625.18329: stdout chunk (state=3): >>><<< 40074 1727204625.18344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204625.18347: _low_level_execute_command(): starting 40074 1727204625.18354: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/AnsiballZ_package_facts.py && sleep 0' 40074 1727204625.18821: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204625.18825: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204625.18827: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204625.18829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204625.18883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204625.18886: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204625.18941: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204625.83551: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": 
[{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 40074 1727204625.83587: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": 
"2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", 
"version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": 
[{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": <<< 40074 1727204625.83615: stdout chunk (state=3): >>>"rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": 
"procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": 
"libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", 
"version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs",
"version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": 
[{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}],
"tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", 
"version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source":
"rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": 
[{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}],
"initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": 
"libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": 
"firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null,
"arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": 
"502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 40074 1727204625.83744: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", 
"release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 40074 1727204625.83753: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", 
"release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": 
[{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 40074 1727204625.83785: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 40074 1727204625.83800: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", 
"version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 40074 1727204625.85761: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204625.85828: stderr chunk (state=3): >>><<< 40074 1727204625.85831: stdout chunk (state=3): >>><<< 40074 1727204625.85873: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": 
"noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": 
"1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": 
"device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": 
[{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", 
"version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", 
"release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": 
"perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", 
"release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": 
"502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": 
"8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204625.88996: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204625.89013: _low_level_execute_command(): starting 40074 1727204625.89021: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204624.6210017-40895-5966244791918/ > /dev/null 2>&1 && sleep 0' 40074 1727204625.89471: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204625.89476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204625.89510: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204625.89514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204625.89516: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204625.89521: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204625.89577: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204625.89581: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204625.89635: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204625.91796: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204625.91799: stdout chunk (state=3): >>><<< 40074 1727204625.91804: stderr chunk (state=3): >>><<< 40074 1727204625.91806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 
40074 1727204625.91809: handler run complete 40074 1727204625.93208: variable 'ansible_facts' from source: unknown 40074 1727204625.93960: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204625.97667: variable 'ansible_facts' from source: unknown 40074 1727204625.98424: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204625.99855: attempt loop complete, returning result 40074 1727204625.99878: _execute() done 40074 1727204625.99882: dumping result to json 40074 1727204626.00194: done dumping result, returning 40074 1727204626.00206: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-9fd7-2501-0000000004a1] 40074 1727204626.00210: sending task result for task 12b410aa-8751-9fd7-2501-0000000004a1 40074 1727204626.04021: done sending task result for task 12b410aa-8751-9fd7-2501-0000000004a1 40074 1727204626.04025: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204626.04132: no more pending results, returning what we have 40074 1727204626.04136: results queue empty 40074 1727204626.04137: checking for any_errors_fatal 40074 1727204626.04146: done checking for any_errors_fatal 40074 1727204626.04147: checking for max_fail_percentage 40074 1727204626.04149: done checking for max_fail_percentage 40074 1727204626.04150: checking to see if all hosts have failed and the running result is not ok 40074 1727204626.04151: done checking to see if all hosts have failed 40074 1727204626.04152: getting the remaining hosts for this loop 40074 1727204626.04153: done getting the remaining hosts for this loop 40074 1727204626.04158: getting the next task for host managed-node2 40074 1727204626.04165: done getting next task for 
host managed-node2 40074 1727204626.04169: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 40074 1727204626.04173: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204626.04183: getting variables 40074 1727204626.04185: in VariableManager get_vars() 40074 1727204626.04234: Calling all_inventory to load vars for managed-node2 40074 1727204626.04237: Calling groups_inventory to load vars for managed-node2 40074 1727204626.04240: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204626.04251: Calling all_plugins_play to load vars for managed-node2 40074 1727204626.04255: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204626.04258: Calling groups_plugins_play to load vars for managed-node2 40074 1727204626.06498: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204626.09666: done with get_vars() 40074 1727204626.09710: done getting variables 40074 1727204626.09796: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:46 -0400 (0:00:01.534) 0:00:19.860 ***** 40074 1727204626.09845: entering _queue_task() for managed-node2/debug 40074 1727204626.10332: worker is 1 (out of 1 available) 40074 1727204626.10346: exiting _queue_task() for managed-node2/debug 40074 1727204626.10357: done queuing things up, now waiting for results queue to drain 40074 1727204626.10359: waiting for pending results... 40074 1727204626.10712: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 40074 1727204626.10778: in run() - task 12b410aa-8751-9fd7-2501-00000000001c 40074 1727204626.10815: variable 'ansible_search_path' from source: unknown 40074 1727204626.10829: variable 'ansible_search_path' from source: unknown 40074 1727204626.10874: calling self._execute() 40074 1727204626.10988: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204626.11006: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204626.11034: variable 'omit' from source: magic vars 40074 1727204626.11525: variable 'ansible_distribution_major_version' from source: facts 40074 1727204626.11545: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204626.11559: variable 'omit' from source: magic vars 40074 1727204626.11643: variable 'omit' from source: magic vars 40074 1727204626.11777: variable 'network_provider' from source: set_fact 40074 1727204626.11814: variable 'omit' from source: magic vars 40074 1727204626.11897: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204626.11923: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204626.11956: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 
1727204626.11984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204626.12192: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204626.12197: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204626.12201: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204626.12204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204626.12215: Set connection var ansible_pipelining to False 40074 1727204626.12234: Set connection var ansible_shell_executable to /bin/sh 40074 1727204626.12242: Set connection var ansible_shell_type to sh 40074 1727204626.12251: Set connection var ansible_connection to ssh 40074 1727204626.12265: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204626.12278: Set connection var ansible_timeout to 10 40074 1727204626.12323: variable 'ansible_shell_executable' from source: unknown 40074 1727204626.12337: variable 'ansible_connection' from source: unknown 40074 1727204626.12347: variable 'ansible_module_compression' from source: unknown 40074 1727204626.12355: variable 'ansible_shell_type' from source: unknown 40074 1727204626.12363: variable 'ansible_shell_executable' from source: unknown 40074 1727204626.12371: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204626.12380: variable 'ansible_pipelining' from source: unknown 40074 1727204626.12390: variable 'ansible_timeout' from source: unknown 40074 1727204626.12400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204626.12594: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204626.12614: variable 'omit' from source: magic vars 40074 1727204626.12629: starting attempt loop 40074 1727204626.12651: running the handler 40074 1727204626.12760: handler run complete 40074 1727204626.12767: attempt loop complete, returning result 40074 1727204626.12773: _execute() done 40074 1727204626.12775: dumping result to json 40074 1727204626.12778: done dumping result, returning 40074 1727204626.12780: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-9fd7-2501-00000000001c] 40074 1727204626.12783: sending task result for task 12b410aa-8751-9fd7-2501-00000000001c ok: [managed-node2] => {} MSG: Using network provider: nm 40074 1727204626.13075: no more pending results, returning what we have 40074 1727204626.13079: results queue empty 40074 1727204626.13080: checking for any_errors_fatal 40074 1727204626.13094: done checking for any_errors_fatal 40074 1727204626.13096: checking for max_fail_percentage 40074 1727204626.13098: done checking for max_fail_percentage 40074 1727204626.13099: checking to see if all hosts have failed and the running result is not ok 40074 1727204626.13100: done checking to see if all hosts have failed 40074 1727204626.13101: getting the remaining hosts for this loop 40074 1727204626.13103: done getting the remaining hosts for this loop 40074 1727204626.13108: getting the next task for host managed-node2 40074 1727204626.13119: done getting next task for host managed-node2 40074 1727204626.13125: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 40074 1727204626.13129: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204626.13146: getting variables 40074 1727204626.13149: in VariableManager get_vars() 40074 1727204626.13316: Calling all_inventory to load vars for managed-node2 40074 1727204626.13323: Calling groups_inventory to load vars for managed-node2 40074 1727204626.13326: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204626.13334: done sending task result for task 12b410aa-8751-9fd7-2501-00000000001c 40074 1727204626.13337: WORKER PROCESS EXITING 40074 1727204626.13347: Calling all_plugins_play to load vars for managed-node2 40074 1727204626.13350: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204626.13354: Calling groups_plugins_play to load vars for managed-node2 40074 1727204626.15862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204626.18354: done with get_vars() 40074 1727204626.18380: done getting variables 40074 1727204626.18437: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.086) 0:00:19.946 ***** 40074 1727204626.18466: entering _queue_task() for managed-node2/fail 40074 1727204626.18724: worker is 1 (out of 1 available) 40074 1727204626.18741: exiting _queue_task() for managed-node2/fail 40074 1727204626.18755: done queuing things up, now waiting for results queue to drain 40074 1727204626.18757: waiting for pending results... 40074 1727204626.18945: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 40074 1727204626.19048: in run() - task 12b410aa-8751-9fd7-2501-00000000001d 40074 1727204626.19062: variable 'ansible_search_path' from source: unknown 40074 1727204626.19066: variable 'ansible_search_path' from source: unknown 40074 1727204626.19108: calling self._execute() 40074 1727204626.19181: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204626.19187: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204626.19199: variable 'omit' from source: magic vars 40074 1727204626.19695: variable 'ansible_distribution_major_version' from source: facts 40074 1727204626.19699: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204626.19814: variable 'network_state' from source: role '' defaults 40074 1727204626.19833: Evaluated conditional (network_state != {}): False 40074 1727204626.19844: when evaluation is False, skipping this task 40074 1727204626.19853: _execute() done 40074 1727204626.19863: dumping result to json 40074 1727204626.19871: done dumping result, returning 40074 1727204626.19883: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the 
`network_state` variable with the initscripts provider [12b410aa-8751-9fd7-2501-00000000001d] 40074 1727204626.19897: sending task result for task 12b410aa-8751-9fd7-2501-00000000001d 40074 1727204626.20029: done sending task result for task 12b410aa-8751-9fd7-2501-00000000001d skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204626.20085: no more pending results, returning what we have 40074 1727204626.20092: results queue empty 40074 1727204626.20093: checking for any_errors_fatal 40074 1727204626.20101: done checking for any_errors_fatal 40074 1727204626.20102: checking for max_fail_percentage 40074 1727204626.20104: done checking for max_fail_percentage 40074 1727204626.20105: checking to see if all hosts have failed and the running result is not ok 40074 1727204626.20107: done checking to see if all hosts have failed 40074 1727204626.20108: getting the remaining hosts for this loop 40074 1727204626.20110: done getting the remaining hosts for this loop 40074 1727204626.20114: getting the next task for host managed-node2 40074 1727204626.20121: done getting next task for host managed-node2 40074 1727204626.20125: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 40074 1727204626.20129: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204626.20272: WORKER PROCESS EXITING 40074 1727204626.20291: getting variables 40074 1727204626.20294: in VariableManager get_vars() 40074 1727204626.20477: Calling all_inventory to load vars for managed-node2 40074 1727204626.20480: Calling groups_inventory to load vars for managed-node2 40074 1727204626.20496: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204626.20506: Calling all_plugins_play to load vars for managed-node2 40074 1727204626.20510: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204626.20513: Calling groups_plugins_play to load vars for managed-node2 40074 1727204626.21803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204626.24051: done with get_vars() 40074 1727204626.24125: done getting variables 40074 1727204626.24221: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.058) 0:00:20.004 ***** 40074 1727204626.24286: entering _queue_task() for managed-node2/fail 40074 1727204626.24598: worker is 1 (out of 1 available) 40074 1727204626.24615: exiting _queue_task() for managed-node2/fail 40074 1727204626.24628: done queuing things up, now waiting for results queue to drain 40074 1727204626.24630: waiting for pending results... 
40074 1727204626.24826: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 40074 1727204626.24942: in run() - task 12b410aa-8751-9fd7-2501-00000000001e 40074 1727204626.24956: variable 'ansible_search_path' from source: unknown 40074 1727204626.24962: variable 'ansible_search_path' from source: unknown 40074 1727204626.25001: calling self._execute() 40074 1727204626.25083: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204626.25095: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204626.25104: variable 'omit' from source: magic vars 40074 1727204626.25437: variable 'ansible_distribution_major_version' from source: facts 40074 1727204626.25449: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204626.25557: variable 'network_state' from source: role '' defaults 40074 1727204626.25568: Evaluated conditional (network_state != {}): False 40074 1727204626.25571: when evaluation is False, skipping this task 40074 1727204626.25575: _execute() done 40074 1727204626.25579: dumping result to json 40074 1727204626.25584: done dumping result, returning 40074 1727204626.25594: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-9fd7-2501-00000000001e] 40074 1727204626.25599: sending task result for task 12b410aa-8751-9fd7-2501-00000000001e 40074 1727204626.25703: done sending task result for task 12b410aa-8751-9fd7-2501-00000000001e 40074 1727204626.25706: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204626.25783: no more pending results, returning what we have 40074 
1727204626.25787: results queue empty 40074 1727204626.25788: checking for any_errors_fatal 40074 1727204626.25799: done checking for any_errors_fatal 40074 1727204626.25800: checking for max_fail_percentage 40074 1727204626.25802: done checking for max_fail_percentage 40074 1727204626.25802: checking to see if all hosts have failed and the running result is not ok 40074 1727204626.25803: done checking to see if all hosts have failed 40074 1727204626.25804: getting the remaining hosts for this loop 40074 1727204626.25806: done getting the remaining hosts for this loop 40074 1727204626.25811: getting the next task for host managed-node2 40074 1727204626.25819: done getting next task for host managed-node2 40074 1727204626.25824: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 40074 1727204626.25827: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204626.25845: getting variables 40074 1727204626.25847: in VariableManager get_vars() 40074 1727204626.25888: Calling all_inventory to load vars for managed-node2 40074 1727204626.25899: Calling groups_inventory to load vars for managed-node2 40074 1727204626.25901: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204626.25912: Calling all_plugins_play to load vars for managed-node2 40074 1727204626.25915: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204626.25919: Calling groups_plugins_play to load vars for managed-node2 40074 1727204626.30928: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204626.32805: done with get_vars() 40074 1727204626.32830: done getting variables 40074 1727204626.32872: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.086) 0:00:20.090 ***** 40074 1727204626.32899: entering _queue_task() for managed-node2/fail 40074 1727204626.33157: worker is 1 (out of 1 available) 40074 1727204626.33174: exiting _queue_task() for managed-node2/fail 40074 1727204626.33186: done queuing things up, now waiting for results queue to drain 40074 1727204626.33188: waiting for pending results... 
40074 1727204626.33377: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 40074 1727204626.33494: in run() - task 12b410aa-8751-9fd7-2501-00000000001f 40074 1727204626.33506: variable 'ansible_search_path' from source: unknown 40074 1727204626.33511: variable 'ansible_search_path' from source: unknown 40074 1727204626.33551: calling self._execute() 40074 1727204626.33637: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204626.33643: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204626.33651: variable 'omit' from source: magic vars 40074 1727204626.33980: variable 'ansible_distribution_major_version' from source: facts 40074 1727204626.33992: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204626.34144: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204626.35940: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204626.36004: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204626.36041: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204626.36072: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204626.36096: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204626.36169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204626.36193: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204626.36215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204626.36253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204626.36267: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204626.36347: variable 'ansible_distribution_major_version' from source: facts 40074 1727204626.36360: Evaluated conditional (ansible_distribution_major_version | int > 9): True 40074 1727204626.36458: variable 'ansible_distribution' from source: facts 40074 1727204626.36462: variable '__network_rh_distros' from source: role '' defaults 40074 1727204626.36473: Evaluated conditional (ansible_distribution in __network_rh_distros): False 40074 1727204626.36476: when evaluation is False, skipping this task 40074 1727204626.36479: _execute() done 40074 1727204626.36482: dumping result to json 40074 1727204626.36494: done dumping result, returning 40074 1727204626.36499: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-9fd7-2501-00000000001f] 40074 1727204626.36504: sending task result for task 12b410aa-8751-9fd7-2501-00000000001f 40074 1727204626.36599: done sending task result for task 12b410aa-8751-9fd7-2501-00000000001f 40074 1727204626.36602: WORKER PROCESS EXITING 
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution in __network_rh_distros",
    "skip_reason": "Conditional result was False"
}
40074 1727204626.36649: no more pending results, returning what we have
40074 1727204626.36652: results queue empty
40074 1727204626.36653: checking for any_errors_fatal
40074 1727204626.36663: done checking for any_errors_fatal
40074 1727204626.36664: checking for max_fail_percentage
40074 1727204626.36665: done checking for max_fail_percentage
40074 1727204626.36666: checking to see if all hosts have failed and the running result is not ok
40074 1727204626.36667: done checking to see if all hosts have failed
40074 1727204626.36668: getting the remaining hosts for this loop
40074 1727204626.36670: done getting the remaining hosts for this loop
40074 1727204626.36674: getting the next task for host managed-node2
40074 1727204626.36681: done getting next task for host managed-node2
40074 1727204626.36686: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
40074 1727204626.36690: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204626.36707: getting variables
40074 1727204626.36709: in VariableManager get_vars()
40074 1727204626.36751: Calling all_inventory to load vars for managed-node2
40074 1727204626.36755: Calling groups_inventory to load vars for managed-node2
40074 1727204626.36758: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204626.36767: Calling all_plugins_play to load vars for managed-node2
40074 1727204626.36770: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204626.36774: Calling groups_plugins_play to load vars for managed-node2
40074 1727204626.38124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204626.39725: done with get_vars()
40074 1727204626.39750: done getting variables
40074 1727204626.39830: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36
Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.069) 0:00:20.160 *****
40074 1727204626.39857: entering _queue_task() for managed-node2/dnf
40074 1727204626.40100: worker is 1 (out of 1 available)
40074 1727204626.40116: exiting _queue_task() for managed-node2/dnf
40074 1727204626.40127: done queuing things up, now waiting for results queue to drain
40074 1727204626.40129: waiting for pending results...
40074 1727204626.40313: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces
40074 1727204626.40434: in run() - task 12b410aa-8751-9fd7-2501-000000000020
40074 1727204626.40447: variable 'ansible_search_path' from source: unknown
40074 1727204626.40450: variable 'ansible_search_path' from source: unknown
40074 1727204626.40487: calling self._execute()
40074 1727204626.40569: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204626.40574: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204626.40587: variable 'omit' from source: magic vars
40074 1727204626.40920: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.40929: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204626.41105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204626.42896: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204626.42958: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204626.42992: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204626.43024: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204626.43046: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204626.43119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.43144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.43168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.43205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.43219: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.43314: variable 'ansible_distribution' from source: facts
40074 1727204626.43317: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.43325: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True
40074 1727204626.43418: variable '__network_wireless_connections_defined' from source: role '' defaults
40074 1727204626.43533: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.43557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.43579: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.43612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.43629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.43667: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.43686: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.43709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.43745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.43759: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.43794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.43813: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.43838: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.43872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.43885: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.44018: variable 'network_connections' from source: task vars
40074 1727204626.44031: variable 'interface0' from source: play vars
40074 1727204626.44091: variable 'interface0' from source: play vars
40074 1727204626.44099: variable 'interface0' from source: play vars
40074 1727204626.44151: variable 'interface0' from source: play vars
40074 1727204626.44163: variable 'interface1' from source: play vars
40074 1727204626.44219: variable 'interface1' from source: play vars
40074 1727204626.44228: variable 'interface1' from source: play vars
40074 1727204626.44279: variable 'interface1' from source: play vars
40074 1727204626.44345: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204626.44492: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204626.44616: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204626.44621: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204626.44624: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204626.44637: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204626.44640: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204626.44661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.44682: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204626.44739: variable '__network_team_connections_defined' from source: role '' defaults
40074 1727204626.44939: variable 'network_connections' from source: task vars
40074 1727204626.44943: variable 'interface0' from source: play vars
40074 1727204626.44997: variable 'interface0' from source: play vars
40074 1727204626.45004: variable 'interface0' from source: play vars
40074 1727204626.45057: variable 'interface0' from source: play vars
40074 1727204626.45068: variable 'interface1' from source: play vars
40074 1727204626.45121: variable 'interface1' from source: play vars
40074 1727204626.45130: variable 'interface1' from source: play vars
40074 1727204626.45181: variable 'interface1' from source: play vars
40074 1727204626.45216: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
40074 1727204626.45219: when evaluation is False, skipping this task
40074 1727204626.45226: _execute() done
40074 1727204626.45231: dumping result to json
40074 1727204626.45235: done dumping result, returning
40074 1727204626.45242: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000020]
40074 1727204626.45248: sending task result for task 12b410aa-8751-9fd7-2501-000000000020
40074 1727204626.45340: done sending task result for task 12b410aa-8751-9fd7-2501-000000000020
40074 1727204626.45343: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
40074 1727204626.45399: no more pending results, returning what we have
40074 1727204626.45403: results queue empty
40074 1727204626.45404: checking for any_errors_fatal
40074 1727204626.45413: done checking for any_errors_fatal
40074 1727204626.45414: checking for max_fail_percentage
40074 1727204626.45416: done checking for max_fail_percentage
40074 1727204626.45417: checking to see if all hosts have failed and the running result is not ok
40074 1727204626.45418: done checking to see if all hosts have failed
40074 1727204626.45419: getting the remaining hosts for this loop
40074 1727204626.45420: done getting the remaining hosts for this loop
40074 1727204626.45425: getting the next task for host managed-node2
40074 1727204626.45432: done getting next task for host managed-node2
40074 1727204626.45437: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
40074 1727204626.45440: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204626.45458: getting variables
40074 1727204626.45459: in VariableManager get_vars()
40074 1727204626.45503: Calling all_inventory to load vars for managed-node2
40074 1727204626.45506: Calling groups_inventory to load vars for managed-node2
40074 1727204626.45509: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204626.45518: Calling all_plugins_play to load vars for managed-node2
40074 1727204626.45521: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204626.45525: Calling groups_plugins_play to load vars for managed-node2
40074 1727204626.46815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204626.48406: done with get_vars()
40074 1727204626.48432: done getting variables
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
40074 1727204626.48494: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48
Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.086) 0:00:20.246 *****
40074 1727204626.48518: entering _queue_task() for managed-node2/yum
40074 1727204626.48520: Creating lock for yum
40074 1727204626.48767: worker is 1 (out of 1 available)
40074 1727204626.48783: exiting _queue_task() for managed-node2/yum
40074 1727204626.48796: done queuing things up, now waiting for results queue to drain
40074 1727204626.48798: waiting for pending results...
40074 1727204626.48985: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
40074 1727204626.49094: in run() - task 12b410aa-8751-9fd7-2501-000000000021
40074 1727204626.49107: variable 'ansible_search_path' from source: unknown
40074 1727204626.49111: variable 'ansible_search_path' from source: unknown
40074 1727204626.49148: calling self._execute()
40074 1727204626.49231: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204626.49243: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204626.49253: variable 'omit' from source: magic vars
40074 1727204626.49603: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.49606: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204626.49756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204626.52211: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204626.52322: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204626.52370: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204626.52404: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204626.52430: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204626.52512: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.52538: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.52559: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.52597: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.52610: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.52689: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.52706: Evaluated conditional (ansible_distribution_major_version | int < 8): False
40074 1727204626.52709: when evaluation is False, skipping this task
40074 1727204626.52712: _execute() done
40074 1727204626.52718: dumping result to json
40074 1727204626.52725: done dumping result, returning
40074 1727204626.52732: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000021]
40074 1727204626.52738: sending task result for task 12b410aa-8751-9fd7-2501-000000000021
40074 1727204626.52841: done sending task result for task 12b410aa-8751-9fd7-2501-000000000021
40074 1727204626.52845: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version | int < 8",
    "skip_reason": "Conditional result was False"
}
40074 1727204626.52903: no more pending results, returning what we have
40074 1727204626.52908: results queue empty
40074 1727204626.52909: checking for any_errors_fatal
40074 1727204626.52919: done checking for any_errors_fatal
40074 1727204626.52920: checking for max_fail_percentage
40074 1727204626.52922: done checking for max_fail_percentage
40074 1727204626.52923: checking to see if all hosts have failed and the running result is not ok
40074 1727204626.52924: done checking to see if all hosts have failed
40074 1727204626.52925: getting the remaining hosts for this loop
40074 1727204626.52926: done getting the remaining hosts for this loop
40074 1727204626.52931: getting the next task for host managed-node2
40074 1727204626.52937: done getting next task for host managed-node2
40074 1727204626.52942: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
40074 1727204626.52946: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204626.52963: getting variables
40074 1727204626.52965: in VariableManager get_vars()
40074 1727204626.53015: Calling all_inventory to load vars for managed-node2
40074 1727204626.53018: Calling groups_inventory to load vars for managed-node2
40074 1727204626.53021: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204626.53032: Calling all_plugins_play to load vars for managed-node2
40074 1727204626.53036: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204626.53040: Calling groups_plugins_play to load vars for managed-node2
40074 1727204626.54549: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204626.57563: done with get_vars()
40074 1727204626.57610: done getting variables
40074 1727204626.57682: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60
Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.092) 0:00:20.339 *****
40074 1727204626.57727: entering _queue_task() for managed-node2/fail
40074 1727204626.58291: worker is 1 (out of 1 available)
40074 1727204626.58304: exiting _queue_task() for managed-node2/fail
40074 1727204626.58314: done queuing things up, now waiting for results queue to drain
40074 1727204626.58316: waiting for pending results...
40074 1727204626.58510: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces
40074 1727204626.58654: in run() - task 12b410aa-8751-9fd7-2501-000000000022
40074 1727204626.58658: variable 'ansible_search_path' from source: unknown
40074 1727204626.58661: variable 'ansible_search_path' from source: unknown
40074 1727204626.58693: calling self._execute()
40074 1727204626.58810: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204626.58827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204626.58868: variable 'omit' from source: magic vars
40074 1727204626.59309: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.59331: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204626.59494: variable '__network_wireless_connections_defined' from source: role '' defaults
40074 1727204626.59852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204626.62588: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204626.62688: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204626.62795: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204626.62837: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204626.62877: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204626.62986: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.63130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.63176: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.63475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.63479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.63482: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.63484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.63487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.63551: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.63577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.63646: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.63683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.63730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.63785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.63815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.64096: variable 'network_connections' from source: task vars
40074 1727204626.64122: variable 'interface0' from source: play vars
40074 1727204626.64229: variable 'interface0' from source: play vars
40074 1727204626.64246: variable 'interface0' from source: play vars
40074 1727204626.64338: variable 'interface0' from source: play vars
40074 1727204626.64361: variable 'interface1' from source: play vars
40074 1727204626.64448: variable 'interface1' from source: play vars
40074 1727204626.64463: variable 'interface1' from source: play vars
40074 1727204626.64549: variable 'interface1' from source: play vars
40074 1727204626.64661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204626.65098: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204626.65160: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204626.65209: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204626.65256: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204626.65353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204626.65357: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204626.65396: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.65438: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204626.65527: variable '__network_team_connections_defined' from source: role '' defaults
40074 1727204626.65915: variable 'network_connections' from source: task vars
40074 1727204626.65931: variable 'interface0' from source: play vars
40074 1727204626.66135: variable 'interface0' from source: play vars
40074 1727204626.66139: variable 'interface0' from source: play vars
40074 1727204626.66269: variable 'interface0' from source: play vars
40074 1727204626.66396: variable 'interface1' from source: play vars
40074 1727204626.66452: variable 'interface1' from source: play vars
40074 1727204626.66509: variable 'interface1' from source: play vars
40074 1727204626.66652: variable 'interface1' from source: play vars
40074 1727204626.66907: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
40074 1727204626.66910: when evaluation is False, skipping this task
40074 1727204626.66913: _execute() done
40074 1727204626.66915: dumping result to json
40074 1727204626.66920: done dumping result, returning
40074 1727204626.66923: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000022]
40074 1727204626.66926: sending task result for task 12b410aa-8751-9fd7-2501-000000000022
40074 1727204626.67207: done sending task result for task 12b410aa-8751-9fd7-2501-000000000022
40074 1727204626.67211: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
40074 1727204626.67275: no more pending results, returning what we have
40074 1727204626.67280: results queue empty
40074 1727204626.67282: checking for any_errors_fatal
40074 1727204626.67293: done checking for any_errors_fatal
40074 1727204626.67294: checking for max_fail_percentage
40074 1727204626.67297: done checking for max_fail_percentage
40074 1727204626.67298: checking to see if all hosts have failed and the running result is not ok
40074 1727204626.67299: done checking to see if all hosts have failed
40074 1727204626.67300: getting the remaining hosts for this loop
40074 1727204626.67302: done getting the remaining hosts for this loop
40074 1727204626.67307: getting the next task for host managed-node2
40074 1727204626.67314: done getting next task for host managed-node2
40074 1727204626.67322: ^ task is: TASK: fedora.linux_system_roles.network : Install packages
40074 1727204626.67327: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204626.67346: getting variables
40074 1727204626.67349: in VariableManager get_vars()
40074 1727204626.67805: Calling all_inventory to load vars for managed-node2
40074 1727204626.67809: Calling groups_inventory to load vars for managed-node2
40074 1727204626.67812: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204626.67825: Calling all_plugins_play to load vars for managed-node2
40074 1727204626.67829: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204626.67833: Calling groups_plugins_play to load vars for managed-node2
40074 1727204626.70942: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204626.73900: done with get_vars()
40074 1727204626.73938: done getting variables
40074 1727204626.74015: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install packages] ********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73
Tuesday 24 September 2024 15:03:46 -0400 (0:00:00.163) 0:00:20.502 *****
40074 1727204626.74055: entering _queue_task() for managed-node2/package
40074 1727204626.74380: worker is 1 (out of 1 available)
40074 1727204626.74602: exiting _queue_task() for managed-node2/package
40074 1727204626.74613: done queuing things up, now waiting for results queue to drain
40074 1727204626.74614: waiting for pending results...
40074 1727204626.74711: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages
40074 1727204626.74884: in run() - task 12b410aa-8751-9fd7-2501-000000000023
40074 1727204626.74907: variable 'ansible_search_path' from source: unknown
40074 1727204626.74919: variable 'ansible_search_path' from source: unknown
40074 1727204626.75059: calling self._execute()
40074 1727204626.75078: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204626.75093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204626.75110: variable 'omit' from source: magic vars
40074 1727204626.75571: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.75592: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204626.75856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204626.76252: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204626.76257: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204626.76304: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204626.76391: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204626.76538: variable 'network_packages' from source: role '' defaults
40074 1727204626.76688: variable '__network_provider_setup' from source: role '' defaults
40074 1727204626.76707: variable '__network_service_name_default_nm' from source: role '' defaults
40074 1727204626.76803: variable '__network_service_name_default_nm' from source: role '' defaults
40074 1727204626.76821: variable '__network_packages_default_nm' from source: role '' defaults
40074 1727204626.76992: variable '__network_packages_default_nm' from source: role '' defaults
40074 1727204626.77174: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204626.79603: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204626.79693: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204626.79750: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204626.79828: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204626.79841: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204626.79961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.80007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.80054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.80154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.80157: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.80207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.80247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.80291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.80352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.80382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.80794: variable '__network_packages_default_gobject_packages' from source: role '' defaults
40074 1727204626.80866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.80903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.80948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.81006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.81037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.81160: variable 'ansible_python' from source: facts
40074 1727204626.81198: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
40074 1727204626.81312: variable '__network_wpa_supplicant_required' from source: role '' defaults
40074 1727204626.81425: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
40074 1727204626.81608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.81647: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.81688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.81749: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.81773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.81895: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204626.81907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204626.81929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.81985: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204626.82012: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204626.82211: variable 'network_connections' from source: task vars
40074 1727204626.82228: variable 'interface0' from source: play vars
40074 1727204626.82396: variable 'interface0' from source: play vars
40074 1727204626.82400: variable 'interface0' from source: play vars
40074 1727204626.82508: variable 'interface0' from source: play vars
40074 1727204626.82536: variable 'interface1' from source: play vars
40074 1727204626.82669: variable 'interface1' from source: play vars
40074 1727204626.82696: variable 'interface1' from source: play vars
40074 1727204626.82831: variable 'interface1' from source: play vars
40074 1727204626.82999: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204626.83002: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204626.83022: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204626.83067: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204626.83130: variable '__network_wireless_connections_defined' from source: role '' defaults
40074 1727204626.83529: variable 'network_connections' from source: task vars
40074 1727204626.83540: variable 'interface0' from source: play vars
40074 1727204626.83668: variable 'interface0' from source: play vars
40074 1727204626.83682: variable 'interface0' from source: play vars
40074 1727204626.83807: variable 'interface0' from source: play vars
40074 1727204626.83834: variable 'interface1' from source: play vars
40074 1727204626.83959: variable 'interface1' from source: play vars
40074 1727204626.83998: variable 'interface1' from source: play vars
40074 1727204626.84112: variable 'interface1' from source: play vars
40074 1727204626.84199: variable '__network_packages_default_wireless' from source: role '' defaults
40074 1727204626.84497: variable '__network_wireless_connections_defined' from source: role '' defaults
40074 1727204626.84753: variable 'network_connections' from source: task vars
40074 1727204626.84765: variable 'interface0' from source: play vars
40074 1727204626.84853: variable 'interface0' from source: play vars
40074 1727204626.84867: variable 'interface0' from source: play vars
40074 1727204626.84957: variable 'interface0' from source: play vars
40074 1727204626.84977: variable 'interface1' from source: play vars
40074 1727204626.85067: variable 'interface1' from source: play vars
40074 1727204626.85080: variable 'interface1' from source: play vars
40074 1727204626.85168: variable 'interface1' from source: play vars
40074 1727204626.85211: variable '__network_packages_default_team' from source: role '' defaults
40074 1727204626.85324: variable '__network_team_connections_defined' from source: role '' defaults
40074 1727204626.85754: variable 'network_connections' from source: task vars
40074 1727204626.85766: variable 'interface0' from source: play vars
40074 1727204626.85853: variable 'interface0' from source: play vars
40074 1727204626.85866: variable 'interface0' from source: play vars
40074 1727204626.85957: variable 'interface0' from source: play vars
40074 1727204626.85976: variable 'interface1' from source: play vars
40074 1727204626.86070: variable 'interface1' from source: play vars
40074 1727204626.86083: variable 'interface1' from source: play vars
40074 1727204626.86171: variable 'interface1' from source: play vars
40074 1727204626.86269: variable '__network_service_name_default_initscripts' from source: role '' defaults
40074 1727204626.86355: variable '__network_service_name_default_initscripts' from source: role '' defaults
40074 1727204626.86369: variable '__network_packages_default_initscripts' from source: role '' defaults
40074 1727204626.86449: variable '__network_packages_default_initscripts' from source: role '' defaults
40074 1727204626.86766: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
40074 1727204626.87497: variable 'network_connections' from source: task vars
40074 1727204626.87502: variable 'interface0' from source: play vars
40074 1727204626.87568: variable 'interface0' from source: play vars
40074 1727204626.87574: variable 'interface0' from source: play vars
40074 1727204626.87628: variable 'interface0' from source: play vars
40074 1727204626.87639: variable 'interface1' from source: play vars
40074 1727204626.87694: variable 'interface1' from source: play vars
40074 1727204626.87713: variable 'interface1' from source: play vars
40074 1727204626.87762: variable 'interface1' from source: play vars
40074 1727204626.87775: variable 'ansible_distribution' from source: facts
40074 1727204626.87780: variable '__network_rh_distros' from source: role '' defaults
40074 1727204626.87788: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.87810: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
40074 1727204626.87954: variable 'ansible_distribution' from source: facts
40074 1727204626.87958: variable '__network_rh_distros' from source: role '' defaults
40074 1727204626.87964: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.87971: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
40074 1727204626.88114: variable 'ansible_distribution' from source: facts
40074 1727204626.88118: variable '__network_rh_distros' from source: role '' defaults
40074 1727204626.88127: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.88157: variable 'network_provider' from source: set_fact
40074 1727204626.88171: variable 'ansible_facts' from source: unknown
40074 1727204626.88865: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
40074 1727204626.88870: when evaluation is False, skipping this task
40074 1727204626.88872: _execute() done
40074 1727204626.88875: dumping result to json
40074 1727204626.88880: done dumping result, returning
40074 1727204626.88890: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-9fd7-2501-000000000023]
40074 1727204626.88895: sending task result for task 12b410aa-8751-9fd7-2501-000000000023
40074 1727204626.89001: done sending task result for task 12b410aa-8751-9fd7-2501-000000000023
40074 1727204626.89004: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
40074 1727204626.89062: no more pending results, returning what we have
40074 1727204626.89065: results queue empty
40074 1727204626.89071: checking for any_errors_fatal
40074 1727204626.89079: done checking for any_errors_fatal
40074 1727204626.89080: checking for max_fail_percentage
40074 1727204626.89081: done checking for max_fail_percentage
40074 1727204626.89082: checking to see if all hosts have failed and the running result is not ok
40074 1727204626.89084: done checking to see if all hosts have failed
40074 1727204626.89084: getting the remaining hosts for this loop
40074 1727204626.89086: done getting the remaining hosts for this loop
40074 1727204626.89092: getting the next task for host managed-node2
40074 1727204626.89101: done getting next task for host managed-node2
40074 1727204626.89105: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
40074 1727204626.89109: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204626.89133: getting variables
40074 1727204626.89135: in VariableManager get_vars()
40074 1727204626.89179: Calling all_inventory to load vars for managed-node2
40074 1727204626.89183: Calling groups_inventory to load vars for managed-node2
40074 1727204626.89185: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204626.89197: Calling all_plugins_play to load vars for managed-node2
40074 1727204626.89200: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204626.89203: Calling groups_plugins_play to load vars for managed-node2
40074 1727204626.90985: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204626.92614: done with get_vars()
40074 1727204626.92652: done getting variables
40074 1727204626.92726: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024  15:03:46 -0400 (0:00:00.187)       0:00:20.689 *****
40074 1727204626.92767: entering _queue_task() for managed-node2/package
40074 1727204626.93097: worker is 1 (out of 1 available)
40074 1727204626.93112: exiting _queue_task() for managed-node2/package
40074 1727204626.93125: done queuing things up, now waiting for results queue to drain
40074 1727204626.93127: waiting for pending results...
40074 1727204626.93520: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
40074 1727204626.93695: in run() - task 12b410aa-8751-9fd7-2501-000000000024
40074 1727204626.93700: variable 'ansible_search_path' from source: unknown
40074 1727204626.93703: variable 'ansible_search_path' from source: unknown
40074 1727204626.93710: calling self._execute()
40074 1727204626.93824: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204626.93845: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204626.93894: variable 'omit' from source: magic vars
40074 1727204626.94272: variable 'ansible_distribution_major_version' from source: facts
40074 1727204626.94284: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204626.94390: variable 'network_state' from source: role '' defaults
40074 1727204626.94399: Evaluated conditional (network_state != {}): False
40074 1727204626.94403: when evaluation is False, skipping this task
40074 1727204626.94407: _execute() done
40074 1727204626.94412: dumping result to json
40074 1727204626.94416: done dumping result, returning
40074 1727204626.94426: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-9fd7-2501-000000000024]
40074 1727204626.94431: sending task result for task 12b410aa-8751-9fd7-2501-000000000024
40074 1727204626.94532: done sending task result for task 12b410aa-8751-9fd7-2501-000000000024
40074 1727204626.94535: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
40074 1727204626.94590: no more pending results, returning what we have
40074 1727204626.94595: results queue empty
40074 1727204626.94596: checking for any_errors_fatal
40074 1727204626.94606: done checking for any_errors_fatal
40074 1727204626.94607: checking for max_fail_percentage
40074 1727204626.94609: done checking for max_fail_percentage
40074 1727204626.94610: checking to see if all hosts have failed and the running result is not ok
40074 1727204626.94611: done checking to see if all hosts have failed
40074 1727204626.94612: getting the remaining hosts for this loop
40074 1727204626.94614: done getting the remaining hosts for this loop
40074 1727204626.94620: getting the next task for host managed-node2
40074 1727204626.94627: done getting next task for host managed-node2
40074 1727204626.94632: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
40074 1727204626.94635: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204626.94653: getting variables
40074 1727204626.94655: in VariableManager get_vars()
40074 1727204626.94700: Calling all_inventory to load vars for managed-node2
40074 1727204626.94704: Calling groups_inventory to load vars for managed-node2
40074 1727204626.94706: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204626.94719: Calling all_plugins_play to load vars for managed-node2
40074 1727204626.94723: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204626.94726: Calling groups_plugins_play to load vars for managed-node2
40074 1727204626.96328: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204626.98734: done with get_vars()
40074 1727204626.98763: done getting variables
40074 1727204626.98820: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024  15:03:46 -0400 (0:00:00.060)       0:00:20.750 *****
40074 1727204626.98848: entering _queue_task() for managed-node2/package
40074 1727204626.99103: worker is 1 (out of 1 available)
40074 1727204626.99118: exiting _queue_task() for managed-node2/package
40074 1727204626.99131: done queuing things up, now waiting for results queue to drain
40074 1727204626.99133: waiting for pending results...
40074 1727204626.99324: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
40074 1727204626.99429: in run() - task 12b410aa-8751-9fd7-2501-000000000025
40074 1727204626.99443: variable 'ansible_search_path' from source: unknown
40074 1727204626.99447: variable 'ansible_search_path' from source: unknown
40074 1727204626.99481: calling self._execute()
40074 1727204626.99569: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204626.99582: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204626.99587: variable 'omit' from source: magic vars
40074 1727204627.00055: variable 'ansible_distribution_major_version' from source: facts
40074 1727204627.00059: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204627.00302: variable 'network_state' from source: role '' defaults
40074 1727204627.00306: Evaluated conditional (network_state != {}): False
40074 1727204627.00309: when evaluation is False, skipping this task
40074 1727204627.00311: _execute() done
40074 1727204627.00313: dumping result to json
40074 1727204627.00315: done dumping result, returning
40074 1727204627.00318: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-9fd7-2501-000000000025]
40074 1727204627.00320: sending task result for task 12b410aa-8751-9fd7-2501-000000000025
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
40074 1727204627.00462: no more pending results, returning what we have
40074 1727204627.00467: results queue empty
40074 1727204627.00468: checking for any_errors_fatal
40074 1727204627.00477: done checking for any_errors_fatal
40074 1727204627.00478: checking for max_fail_percentage
40074 1727204627.00480: done checking for max_fail_percentage
40074 1727204627.00481: checking to see if all hosts have failed and the running result is not ok
40074 1727204627.00482: done checking to see if all hosts have failed
40074 1727204627.00483: getting the remaining hosts for this loop
40074 1727204627.00485: done getting the remaining hosts for this loop
40074 1727204627.00492: getting the next task for host managed-node2
40074 1727204627.00501: done getting next task for host managed-node2
40074 1727204627.00505: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
40074 1727204627.00511: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204627.00531: getting variables
40074 1727204627.00533: in VariableManager get_vars()
40074 1727204627.00581: Calling all_inventory to load vars for managed-node2
40074 1727204627.00585: Calling groups_inventory to load vars for managed-node2
40074 1727204627.00588: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204627.00768: Calling all_plugins_play to load vars for managed-node2
40074 1727204627.00773: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204627.00777: Calling groups_plugins_play to load vars for managed-node2
40074 1727204627.01328: done sending task result for task 12b410aa-8751-9fd7-2501-000000000025
40074 1727204627.01332: WORKER PROCESS EXITING
40074 1727204627.02281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204627.03883: done with get_vars()
40074 1727204627.03912: done getting variables
40074 1727204627.04004: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024  15:03:47 -0400 (0:00:00.051)       0:00:20.802 *****
40074 1727204627.04032: entering _queue_task() for managed-node2/service
40074 1727204627.04034: Creating lock for service
40074 1727204627.04309: worker is 1 (out of 1 available)
40074 1727204627.04325: exiting _queue_task() for managed-node2/service
40074 1727204627.04338: done queuing things up, now waiting for results queue to drain
40074 1727204627.04340: waiting for pending results...
40074 1727204627.04533: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 40074 1727204627.04637: in run() - task 12b410aa-8751-9fd7-2501-000000000026 40074 1727204627.04651: variable 'ansible_search_path' from source: unknown 40074 1727204627.04655: variable 'ansible_search_path' from source: unknown 40074 1727204627.04694: calling self._execute() 40074 1727204627.04778: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204627.04784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204627.04798: variable 'omit' from source: magic vars 40074 1727204627.05130: variable 'ansible_distribution_major_version' from source: facts 40074 1727204627.05141: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204627.05247: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204627.05427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204627.07210: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204627.07273: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204627.07308: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204627.07342: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204627.07364: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204627.07439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 40074 1727204627.07463: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204627.07484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.07525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204627.07540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204627.07580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204627.07602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204627.07632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.07663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204627.07677: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204627.07713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204627.07740: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204627.07761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.07793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204627.07806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204627.07952: variable 'network_connections' from source: task vars 40074 1727204627.07964: variable 'interface0' from source: play vars 40074 1727204627.08028: variable 'interface0' from source: play vars 40074 1727204627.08036: variable 'interface0' from source: play vars 40074 1727204627.08092: variable 'interface0' from source: play vars 40074 1727204627.08105: variable 'interface1' from source: play vars 40074 1727204627.08160: variable 'interface1' from source: play vars 40074 1727204627.08167: variable 'interface1' from source: play vars 40074 1727204627.08219: variable 'interface1' from source: play vars 40074 1727204627.08288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
40074 1727204627.08433: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204627.08464: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204627.08494: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204627.08521: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204627.08559: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204627.08577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204627.08603: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.08629: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204627.08681: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204627.08886: variable 'network_connections' from source: task vars 40074 1727204627.08892: variable 'interface0' from source: play vars 40074 1727204627.08948: variable 'interface0' from source: play vars 40074 1727204627.08955: variable 'interface0' from source: play vars 40074 1727204627.09005: variable 'interface0' from source: play vars 40074 1727204627.09017: variable 'interface1' from source: play vars 40074 1727204627.09072: variable 'interface1' from source: play vars 40074 1727204627.09085: variable 
'interface1' from source: play vars 40074 1727204627.09136: variable 'interface1' from source: play vars 40074 1727204627.09170: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 40074 1727204627.09173: when evaluation is False, skipping this task 40074 1727204627.09176: _execute() done 40074 1727204627.09182: dumping result to json 40074 1727204627.09186: done dumping result, returning 40074 1727204627.09195: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000026] 40074 1727204627.09200: sending task result for task 12b410aa-8751-9fd7-2501-000000000026 40074 1727204627.09298: done sending task result for task 12b410aa-8751-9fd7-2501-000000000026 40074 1727204627.09301: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 40074 1727204627.09352: no more pending results, returning what we have 40074 1727204627.09356: results queue empty 40074 1727204627.09357: checking for any_errors_fatal 40074 1727204627.09366: done checking for any_errors_fatal 40074 1727204627.09367: checking for max_fail_percentage 40074 1727204627.09369: done checking for max_fail_percentage 40074 1727204627.09369: checking to see if all hosts have failed and the running result is not ok 40074 1727204627.09371: done checking to see if all hosts have failed 40074 1727204627.09371: getting the remaining hosts for this loop 40074 1727204627.09373: done getting the remaining hosts for this loop 40074 1727204627.09377: getting the next task for host managed-node2 40074 1727204627.09385: done getting next task for host managed-node2 40074 1727204627.09392: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 40074 
1727204627.09395: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204627.09413: getting variables 40074 1727204627.09415: in VariableManager get_vars() 40074 1727204627.09460: Calling all_inventory to load vars for managed-node2 40074 1727204627.09463: Calling groups_inventory to load vars for managed-node2 40074 1727204627.09466: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204627.09476: Calling all_plugins_play to load vars for managed-node2 40074 1727204627.09479: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204627.09482: Calling groups_plugins_play to load vars for managed-node2 40074 1727204627.10768: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204627.12367: done with get_vars() 40074 1727204627.12394: done getting variables 40074 1727204627.12449: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.084) 0:00:20.886 
***** 40074 1727204627.12476: entering _queue_task() for managed-node2/service 40074 1727204627.12735: worker is 1 (out of 1 available) 40074 1727204627.12751: exiting _queue_task() for managed-node2/service 40074 1727204627.12765: done queuing things up, now waiting for results queue to drain 40074 1727204627.12767: waiting for pending results... 40074 1727204627.12961: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 40074 1727204627.13070: in run() - task 12b410aa-8751-9fd7-2501-000000000027 40074 1727204627.13085: variable 'ansible_search_path' from source: unknown 40074 1727204627.13090: variable 'ansible_search_path' from source: unknown 40074 1727204627.13127: calling self._execute() 40074 1727204627.13207: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204627.13213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204627.13229: variable 'omit' from source: magic vars 40074 1727204627.13557: variable 'ansible_distribution_major_version' from source: facts 40074 1727204627.13566: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204627.13709: variable 'network_provider' from source: set_fact 40074 1727204627.13713: variable 'network_state' from source: role '' defaults 40074 1727204627.13727: Evaluated conditional (network_provider == "nm" or network_state != {}): True 40074 1727204627.13734: variable 'omit' from source: magic vars 40074 1727204627.13783: variable 'omit' from source: magic vars 40074 1727204627.13810: variable 'network_service_name' from source: role '' defaults 40074 1727204627.13874: variable 'network_service_name' from source: role '' defaults 40074 1727204627.13970: variable '__network_provider_setup' from source: role '' defaults 40074 1727204627.13976: variable '__network_service_name_default_nm' from source: role '' defaults 40074 1727204627.14036: variable 
'__network_service_name_default_nm' from source: role '' defaults 40074 1727204627.14045: variable '__network_packages_default_nm' from source: role '' defaults 40074 1727204627.14102: variable '__network_packages_default_nm' from source: role '' defaults 40074 1727204627.14297: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204627.16258: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204627.16326: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204627.16359: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204627.16391: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204627.16418: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204627.16490: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204627.16519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204627.16543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.16576: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204627.16590: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204627.16636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204627.16656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204627.16678: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.16713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204627.16731: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204627.16926: variable '__network_packages_default_gobject_packages' from source: role '' defaults 40074 1727204627.17029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204627.17053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204627.17074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.17107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204627.17119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204627.17200: variable 'ansible_python' from source: facts 40074 1727204627.17220: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 40074 1727204627.17294: variable '__network_wpa_supplicant_required' from source: role '' defaults 40074 1727204627.17362: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 40074 1727204627.17471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204627.17496: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204627.17517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.17550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204627.17563: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204627.17609: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204627.17635: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204627.17655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.17685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204627.17701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204627.17819: variable 'network_connections' from source: task vars 40074 1727204627.17827: variable 'interface0' from source: play vars 40074 1727204627.17888: variable 'interface0' from source: play vars 40074 1727204627.17900: variable 'interface0' from source: play vars 40074 1727204627.17966: variable 'interface0' from source: play vars 40074 1727204627.17991: variable 'interface1' from source: play vars 40074 1727204627.18055: variable 'interface1' from source: play vars 40074 1727204627.18066: variable 'interface1' from source: play vars 40074 1727204627.18131: variable 'interface1' from source: play vars 40074 1727204627.18229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 
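The `TASK [...]` banners in this log each end with two timings, e.g. `(0:00:00.084)  0:00:20.886`: the parenthesised value is the previous task's duration and the second value is the cumulative play time. A hypothetical helper (not Ansible code) to convert those `H:MM:SS.mmm` stamps into seconds:

```python
import re

def to_seconds(stamp):
    """Convert an 'H:MM:SS.mmm' banner stamp to seconds."""
    h, m, s = stamp.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

def banner_timings(line):
    """Extract (previous-task duration, cumulative time) from a TASK banner."""
    last, total = re.search(
        r"\((\d+:\d+:[\d.]+)\)\s+(\d+:\d+:[\d.]+)", line).groups()
    return to_seconds(last), to_seconds(total)

last, total = banner_timings(
    "Tuesday 24 September 2024 15:03:47 -0400 (0:00:00.084)  0:00:20.886")
print(last, total)  # 0.084 20.886
```

Summing the per-task values across all banners should roughly reproduce the final cumulative figure, which makes this a quick sanity check when profiling a slow play.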
40074 1727204627.18396: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204627.18439: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204627.18476: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204627.18516: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204627.18566: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204627.18602: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204627.18630: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204627.18657: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204627.18703: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204627.18958: variable 'network_connections' from source: task vars 40074 1727204627.18965: variable 'interface0' from source: play vars 40074 1727204627.19031: variable 'interface0' from source: play vars 40074 1727204627.19041: variable 'interface0' from source: play vars 40074 1727204627.19100: variable 'interface0' from source: play vars 40074 1727204627.19129: variable 'interface1' from source: play vars 40074 1727204627.19187: variable 'interface1' from source: play vars 40074 1727204627.19198: variable 
'interface1' from source: play vars 40074 1727204627.19261: variable 'interface1' from source: play vars 40074 1727204627.19310: variable '__network_packages_default_wireless' from source: role '' defaults 40074 1727204627.19379: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204627.19623: variable 'network_connections' from source: task vars 40074 1727204627.19626: variable 'interface0' from source: play vars 40074 1727204627.19686: variable 'interface0' from source: play vars 40074 1727204627.19694: variable 'interface0' from source: play vars 40074 1727204627.19752: variable 'interface0' from source: play vars 40074 1727204627.19763: variable 'interface1' from source: play vars 40074 1727204627.19826: variable 'interface1' from source: play vars 40074 1727204627.19833: variable 'interface1' from source: play vars 40074 1727204627.19893: variable 'interface1' from source: play vars 40074 1727204627.19921: variable '__network_packages_default_team' from source: role '' defaults 40074 1727204627.19984: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204627.20229: variable 'network_connections' from source: task vars 40074 1727204627.20232: variable 'interface0' from source: play vars 40074 1727204627.20287: variable 'interface0' from source: play vars 40074 1727204627.20295: variable 'interface0' from source: play vars 40074 1727204627.20357: variable 'interface0' from source: play vars 40074 1727204627.20368: variable 'interface1' from source: play vars 40074 1727204627.20427: variable 'interface1' from source: play vars 40074 1727204627.20433: variable 'interface1' from source: play vars 40074 1727204627.20494: variable 'interface1' from source: play vars 40074 1727204627.20550: variable '__network_service_name_default_initscripts' from source: role '' defaults 40074 1727204627.20694: variable '__network_service_name_default_initscripts' from source: role '' defaults 40074 
1727204627.20697: variable '__network_packages_default_initscripts' from source: role '' defaults 40074 1727204627.20699: variable '__network_packages_default_initscripts' from source: role '' defaults 40074 1727204627.20976: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 40074 1727204627.21656: variable 'network_connections' from source: task vars 40074 1727204627.21669: variable 'interface0' from source: play vars 40074 1727204627.21746: variable 'interface0' from source: play vars 40074 1727204627.21758: variable 'interface0' from source: play vars 40074 1727204627.21833: variable 'interface0' from source: play vars 40074 1727204627.21854: variable 'interface1' from source: play vars 40074 1727204627.21945: variable 'interface1' from source: play vars 40074 1727204627.21956: variable 'interface1' from source: play vars 40074 1727204627.22009: variable 'interface1' from source: play vars 40074 1727204627.22023: variable 'ansible_distribution' from source: facts 40074 1727204627.22027: variable '__network_rh_distros' from source: role '' defaults 40074 1727204627.22037: variable 'ansible_distribution_major_version' from source: facts 40074 1727204627.22061: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 40074 1727204627.22215: variable 'ansible_distribution' from source: facts 40074 1727204627.22221: variable '__network_rh_distros' from source: role '' defaults 40074 1727204627.22224: variable 'ansible_distribution_major_version' from source: facts 40074 1727204627.22232: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 40074 1727204627.22380: variable 'ansible_distribution' from source: facts 40074 1727204627.22384: variable '__network_rh_distros' from source: role '' defaults 40074 1727204627.22387: variable 'ansible_distribution_major_version' from source: facts 40074 1727204627.22419: variable 'network_provider' from source: 
set_fact 40074 1727204627.22439: variable 'omit' from source: magic vars 40074 1727204627.22464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204627.22492: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204627.22509: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204627.22526: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204627.22538: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204627.22565: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204627.22568: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204627.22573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204627.22665: Set connection var ansible_pipelining to False 40074 1727204627.22672: Set connection var ansible_shell_executable to /bin/sh 40074 1727204627.22675: Set connection var ansible_shell_type to sh 40074 1727204627.22677: Set connection var ansible_connection to ssh 40074 1727204627.22686: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204627.22695: Set connection var ansible_timeout to 10 40074 1727204627.22723: variable 'ansible_shell_executable' from source: unknown 40074 1727204627.22726: variable 'ansible_connection' from source: unknown 40074 1727204627.22729: variable 'ansible_module_compression' from source: unknown 40074 1727204627.22732: variable 'ansible_shell_type' from source: unknown 40074 1727204627.22735: variable 'ansible_shell_executable' from source: unknown 40074 1727204627.22737: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204627.22743: 
variable 'ansible_pipelining' from source: unknown 40074 1727204627.22745: variable 'ansible_timeout' from source: unknown 40074 1727204627.22752: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204627.22844: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204627.22855: variable 'omit' from source: magic vars 40074 1727204627.22861: starting attempt loop 40074 1727204627.22864: running the handler 40074 1727204627.22932: variable 'ansible_facts' from source: unknown 40074 1727204627.23576: _low_level_execute_command(): starting 40074 1727204627.23583: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204627.24528: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 40074 1727204627.24531: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204627.26310: stdout chunk (state=3): >>>/root <<< 40074 1727204627.26413: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204627.26498: stderr chunk (state=3): >>><<< 40074 1727204627.26508: stdout chunk (state=3): >>><<< 40074 1727204627.26537: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204627.26556: _low_level_execute_command(): starting 40074 1727204627.26567: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901 `" && echo ansible-tmp-1727204627.265442-40964-14641513816901="` echo 
/root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901 `" ) && sleep 0' 40074 1727204627.27229: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204627.27252: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204627.27266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204627.27304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204627.27318: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204627.27356: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204627.27427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204627.27445: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204627.27478: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204627.27547: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204627.29586: stdout chunk (state=3): >>>ansible-tmp-1727204627.265442-40964-14641513816901=/root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901 <<< 40074 1727204627.29806: stderr chunk (state=3): >>>debug2: Received exit status from master 0 
<<< 40074 1727204627.29810: stdout chunk (state=3): >>><<< 40074 1727204627.29812: stderr chunk (state=3): >>><<< 40074 1727204627.30002: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204627.265442-40964-14641513816901=/root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204627.30011: variable 'ansible_module_compression' from source: unknown 40074 1727204627.30015: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 40074 1727204627.30020: ANSIBALLZ: Acquiring lock 40074 1727204627.30023: ANSIBALLZ: Lock acquired: 139809964199616 40074 1727204627.30026: ANSIBALLZ: Creating module 40074 1727204627.68991: ANSIBALLZ: Writing module into payload 40074 1727204627.69219: ANSIBALLZ: Writing module 40074 1727204627.69261: ANSIBALLZ: Renaming module 40074 1727204627.69273: ANSIBALLZ: Done creating module 
40074 1727204627.69304: variable 'ansible_facts' from source: unknown 40074 1727204627.69518: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/AnsiballZ_systemd.py 40074 1727204627.69804: Sending initial data 40074 1727204627.69814: Sent initial data (154 bytes) 40074 1727204627.70485: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204627.70506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204627.70624: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204627.70812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204627.70893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204627.72738: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204627.72777: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204627.72854: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/AnsiballZ_systemd.py" <<< 40074 1727204627.72867: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmplncgnxiu /root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/AnsiballZ_systemd.py <<< 40074 1727204627.72879: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmplncgnxiu" to remote "/root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/AnsiballZ_systemd.py" <<< 40074 1727204627.75994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204627.76296: stderr chunk (state=3): >>><<< 40074 1727204627.76300: stdout chunk (state=3): >>><<< 40074 1727204627.76302: done transferring module to remote 40074 1727204627.76305: _low_level_execute_command(): starting 40074 1727204627.76307: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/ /root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/AnsiballZ_systemd.py && sleep 0' 
40074 1727204627.77251: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204627.77270: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204627.77291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204627.77314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204627.77352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204627.77407: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204627.77476: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204627.77498: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204627.77516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204627.77595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204627.79702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204627.79721: stdout chunk (state=3): >>><<< 40074 1727204627.79768: stderr chunk (state=3): >>><<< 40074 1727204627.79794: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration 
data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204627.80147: _low_level_execute_command(): starting 40074 1727204627.80150: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/AnsiballZ_systemd.py && sleep 0' 40074 1727204627.80728: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204627.80736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204627.80750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204627.80767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204627.80780: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204627.80791: stderr chunk (state=3): >>>debug2: match not found <<< 40074 
1727204627.80802: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204627.80817: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204627.80835: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204627.80839: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 40074 1727204627.80846: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204627.80857: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204627.80871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204627.80881: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204627.80897: stderr chunk (state=3): >>>debug2: match found <<< 40074 1727204627.80900: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204627.80995: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204627.80999: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204627.81011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204627.81088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204628.14855: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", 
"TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4571136", "MemoryAvailable": "infinity", "CPUUsageNSec": "2372226000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no 
data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 40074 1727204628.14891: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": 
"8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": 
"no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target network.target network.service cloud-init.service NetworkManager-wait-online.service", "After": "systemd-journald.socket sysinit.target dbus.socket cloud-init-local.service system.slice network-pre.target basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 40074 1727204628.14901: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": 
"running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:49 EDT", "StateChangeTimestampMonotonic": "1013574884", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 40074 1727204628.17041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. <<< 40074 1727204628.17101: stderr chunk (state=3): >>><<< 40074 1727204628.17105: stdout chunk (state=3): >>><<< 40074 1727204628.17126: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; 
ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4571136", "MemoryAvailable": "infinity", "CPUUsageNSec": "2372226000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", 
"MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", 
"PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": 
"shutdown.target multi-user.target network.target network.service cloud-init.service NetworkManager-wait-online.service", "After": "systemd-journald.socket sysinit.target dbus.socket cloud-init-local.service system.slice network-pre.target basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:49 EDT", "StateChangeTimestampMonotonic": "1013574884", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204628.17299: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None})
40074 1727204628.17315: _low_level_execute_command(): starting
40074 1727204628.17321: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204627.265442-40964-14641513816901/ > /dev/null 2>&1 && sleep 0'
40074 1727204628.17782: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204628.17820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<<
40074 1727204628.17823: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204628.17826: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204628.17877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204628.17881: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204628.17930: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204628.19908: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204628.19957: stderr chunk (state=3): >>><<<
40074 1727204628.19961: stdout chunk (state=3): >>><<<
40074 1727204628.19974: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
40074 1727204628.19983: handler run complete
40074 1727204628.20035: attempt loop complete, returning result
40074 1727204628.20039: _execute() done
40074 1727204628.20042: dumping result to json
40074 1727204628.20062: done dumping result, returning
40074 1727204628.20072: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-9fd7-2501-000000000027]
40074 1727204628.20075: sending task result for task 12b410aa-8751-9fd7-2501-000000000027
ok: [managed-node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
40074 1727204628.20624: no more pending results, returning what we have
40074 1727204628.20627: results queue empty
40074 1727204628.20628: checking for any_errors_fatal
40074 1727204628.20634: done checking for any_errors_fatal
40074 1727204628.20635: checking for max_fail_percentage
40074 1727204628.20636: done checking for max_fail_percentage
40074 1727204628.20637: checking to see if all hosts have failed and the running result is not ok
40074 1727204628.20638: done checking to see if all hosts have failed
40074 1727204628.20639: getting the remaining hosts for this loop
40074 1727204628.20640: done getting the remaining hosts for this loop
40074 1727204628.20644: getting the next task for host managed-node2
40074 1727204628.20650: done getting next task for host managed-node2
40074 1727204628.20654: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
40074 1727204628.20657: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204628.20667: getting variables
40074 1727204628.20669: in VariableManager get_vars()
40074 1727204628.20700: Calling all_inventory to load vars for managed-node2
40074 1727204628.20702: Calling groups_inventory to load vars for managed-node2
40074 1727204628.20704: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204628.20713: Calling all_plugins_play to load vars for managed-node2
40074 1727204628.20715: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204628.20723: done sending task result for task 12b410aa-8751-9fd7-2501-000000000027
40074 1727204628.20727: WORKER PROCESS EXITING
40074 1727204628.20732: Calling groups_plugins_play to load vars for managed-node2
40074 1727204628.21875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204628.23462: done with get_vars()
40074 1727204628.23485: done getting variables
40074 1727204628.23539: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133
Tuesday 24 September 2024 15:03:48 -0400 (0:00:01.110) 0:00:21.997 *****
40074 1727204628.23570: entering _queue_task() for managed-node2/service
40074 1727204628.23833: worker is 1 (out of 1 available)
40074 1727204628.23847: exiting _queue_task() for managed-node2/service
40074 1727204628.23861: done queuing things up, now waiting for results queue to drain
40074 1727204628.23862: waiting for pending results...
40074 1727204628.24066: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant
40074 1727204628.24182: in run() - task 12b410aa-8751-9fd7-2501-000000000028
40074 1727204628.24196: variable 'ansible_search_path' from source: unknown
40074 1727204628.24199: variable 'ansible_search_path' from source: unknown
40074 1727204628.24238: calling self._execute()
40074 1727204628.24324: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204628.24329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204628.24339: variable 'omit' from source: magic vars
40074 1727204628.24669: variable 'ansible_distribution_major_version' from source: facts
40074 1727204628.24680: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204628.24783: variable 'network_provider' from source: set_fact
40074 1727204628.24787: Evaluated conditional (network_provider == "nm"): True
40074 1727204628.24871: variable '__network_wpa_supplicant_required' from source: role '' defaults
40074 1727204628.24948: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
40074 1727204628.25107: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204628.26910: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204628.26967: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204628.27001: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204628.27039: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204628.27058: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204628.27131: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204628.27169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204628.27192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204628.27227: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204628.27242: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204628.27284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204628.27307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204628.27332: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204628.27365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204628.27380: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204628.27417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204628.27439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204628.27460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204628.27497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204628.27511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204628.27629: variable 'network_connections' from source: task vars
40074 1727204628.27640: variable 'interface0' from source: play vars
40074 1727204628.27703: variable 'interface0' from source: play vars
40074 1727204628.27712: variable 'interface0' from source: play vars
40074 1727204628.27764: variable 'interface0' from source: play vars
40074 1727204628.27776: variable 'interface1' from source: play vars
40074 1727204628.27833: variable 'interface1' from source: play vars
40074 1727204628.27840: variable 'interface1' from source: play vars
40074 1727204628.27892: variable 'interface1' from source: play vars
40074 1727204628.27961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204628.28097: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204628.28137: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204628.28164: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204628.28191: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204628.28232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204628.28253: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204628.28273: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204628.28296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204628.28341: variable '__network_wireless_connections_defined' from source: role '' defaults
40074 1727204628.28559: variable 'network_connections' from source: task vars
40074 1727204628.28564: variable 'interface0' from source: play vars
40074 1727204628.28617: variable 'interface0' from source: play vars
40074 1727204628.28626: variable 'interface0' from source: play vars
40074 1727204628.28683: variable 'interface0' from source: play vars
40074 1727204628.28697: variable 'interface1' from source: play vars
40074 1727204628.28749: variable 'interface1' from source: play vars
40074 1727204628.28756: variable 'interface1' from source: play vars
40074 1727204628.28811: variable 'interface1' from source: play vars
40074 1727204628.28851: Evaluated conditional (__network_wpa_supplicant_required): False
40074 1727204628.28855: when evaluation is False, skipping this task
40074 1727204628.28857: _execute() done
40074 1727204628.28863: dumping result to json
40074 1727204628.28866: done dumping result, returning
40074 1727204628.28874: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-9fd7-2501-000000000028]
40074 1727204628.28884: sending task result for task 12b410aa-8751-9fd7-2501-000000000028
40074 1727204628.28977: done sending task result for task 12b410aa-8751-9fd7-2501-000000000028
40074 1727204628.28980: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wpa_supplicant_required",
    "skip_reason": "Conditional result was False"
}
40074 1727204628.29038: no more pending results, returning what we have
40074 1727204628.29042: results queue empty
40074 1727204628.29043: checking for any_errors_fatal
40074 1727204628.29072: done checking for any_errors_fatal
40074 1727204628.29073: checking for max_fail_percentage
40074 1727204628.29075: done checking for max_fail_percentage
40074 1727204628.29076: checking to see if all hosts have failed and the running result is not ok
40074 1727204628.29078: done checking to see if all hosts have failed
40074 1727204628.29079: getting the remaining hosts for this loop
40074 1727204628.29080: done getting the remaining hosts for this loop
40074 1727204628.29085: getting the next task for host managed-node2
40074 1727204628.29094: done getting next task for host managed-node2
40074 1727204628.29099: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service
40074 1727204628.29102: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204628.29118: getting variables
40074 1727204628.29120: in VariableManager get_vars()
40074 1727204628.29169: Calling all_inventory to load vars for managed-node2
40074 1727204628.29173: Calling groups_inventory to load vars for managed-node2
40074 1727204628.29176: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204628.29186: Calling all_plugins_play to load vars for managed-node2
40074 1727204628.29227: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204628.29235: Calling groups_plugins_play to load vars for managed-node2
40074 1727204628.30594: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204628.32790: done with get_vars()
40074 1727204628.32816: done getting variables
40074 1727204628.32871: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Enable network service] **************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142
Tuesday 24 September 2024 15:03:48 -0400 (0:00:00.093) 0:00:22.090 *****
40074 1727204628.32902: entering _queue_task() for managed-node2/service
40074 1727204628.33165: worker is 1 (out of 1 available)
40074 1727204628.33181: exiting _queue_task() for managed-node2/service
40074 1727204628.33197: done queuing things up, now waiting for results queue to drain
40074 1727204628.33199: waiting for pending results...
40074 1727204628.33395: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service
40074 1727204628.33503: in run() - task 12b410aa-8751-9fd7-2501-000000000029
40074 1727204628.33515: variable 'ansible_search_path' from source: unknown
40074 1727204628.33519: variable 'ansible_search_path' from source: unknown
40074 1727204628.33560: calling self._execute()
40074 1727204628.33649: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204628.33654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204628.33663: variable 'omit' from source: magic vars
40074 1727204628.34295: variable 'ansible_distribution_major_version' from source: facts
40074 1727204628.34299: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204628.34371: variable 'network_provider' from source: set_fact
40074 1727204628.34384: Evaluated conditional (network_provider == "initscripts"): False
40074 1727204628.34395: when evaluation is False, skipping this task
40074 1727204628.34404: _execute() done
40074 1727204628.34414: dumping result to json
40074 1727204628.34426: done dumping result, returning
40074 1727204628.34440: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-9fd7-2501-000000000029]
40074 1727204628.34451: sending task result for task 12b410aa-8751-9fd7-2501-000000000029
skipping: [managed-node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
40074 1727204628.34614: no more pending results, returning what we have
40074 1727204628.34619: results queue empty
40074 1727204628.34620: checking for any_errors_fatal
40074 1727204628.34631: done checking for any_errors_fatal
40074 1727204628.34632: checking for max_fail_percentage
40074 1727204628.34633: done checking for max_fail_percentage
40074 1727204628.34634: checking to see if all hosts have failed and the running result is not ok
40074 1727204628.34636: done checking to see if all hosts have failed
40074 1727204628.34637: getting the remaining hosts for this loop
40074 1727204628.34638: done getting the remaining hosts for this loop
40074 1727204628.34642: getting the next task for host managed-node2
40074 1727204628.34649: done getting next task for host managed-node2
40074 1727204628.34653: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
40074 1727204628.34657: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204628.34676: getting variables
40074 1727204628.34678: in VariableManager get_vars()
40074 1727204628.34723: Calling all_inventory to load vars for managed-node2
40074 1727204628.34727: Calling groups_inventory to load vars for managed-node2
40074 1727204628.34729: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204628.34742: Calling all_plugins_play to load vars for managed-node2
40074 1727204628.34745: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204628.34750: Calling groups_plugins_play to load vars for managed-node2
40074 1727204628.35306: done sending task result for task 12b410aa-8751-9fd7-2501-000000000029
40074 1727204628.35310: WORKER PROCESS EXITING
40074 1727204628.37281: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204628.40307: done with get_vars()
40074 1727204628.40354: done getting variables
40074 1727204628.40443: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024 15:03:48 -0400 (0:00:00.075) 0:00:22.166 *****
40074 1727204628.40485: entering _queue_task() for managed-node2/copy
40074 1727204628.41098: worker is 1 (out of 1 available)
40074 1727204628.41110: exiting _queue_task() for managed-node2/copy
40074 1727204628.41125: done queuing things up, now waiting for results queue to drain
40074 1727204628.41126: waiting for pending results...
40074 1727204628.41258: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present
40074 1727204628.41461: in run() - task 12b410aa-8751-9fd7-2501-00000000002a
40074 1727204628.41466: variable 'ansible_search_path' from source: unknown
40074 1727204628.41468: variable 'ansible_search_path' from source: unknown
40074 1727204628.41509: calling self._execute()
40074 1727204628.41677: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204628.41681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204628.41684: variable 'omit' from source: magic vars
40074 1727204628.42110: variable 'ansible_distribution_major_version' from source: facts
40074 1727204628.42136: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204628.42297: variable 'network_provider' from source: set_fact
40074 1727204628.42309: Evaluated conditional (network_provider == "initscripts"): False
40074 1727204628.42320: when evaluation is False, skipping this task
40074 1727204628.42331: _execute() done
40074 1727204628.42343: dumping result to json
40074 1727204628.42394: done dumping result, returning
40074 1727204628.42398: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-9fd7-2501-00000000002a]
40074 1727204628.42401: sending task result for task 12b410aa-8751-9fd7-2501-00000000002a
40074 1727204628.42644: done sending task result for task 12b410aa-8751-9fd7-2501-00000000002a
40074 1727204628.42648: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_provider == \"initscripts\"",
    "skip_reason": "Conditional result was False"
}
40074 1727204628.42705: no more pending results, returning what we have
40074 1727204628.42710: results queue empty
40074 1727204628.42711: checking for any_errors_fatal
40074 1727204628.42726: done checking for any_errors_fatal
40074 1727204628.42727: checking for max_fail_percentage
40074 1727204628.42729: done checking for max_fail_percentage
40074 1727204628.42730: checking to see if all hosts have failed and the running result is not ok
40074 1727204628.42732: done checking to see if all hosts have failed
40074 1727204628.42733: getting the remaining hosts for this loop
40074 1727204628.42735: done getting the remaining hosts for this loop
40074 1727204628.42739: getting the next task for host managed-node2
40074 1727204628.42747: done getting next task for host managed-node2
40074 1727204628.42754: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles
40074 1727204628.42758: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204628.42776: getting variables
40074 1727204628.42778: in VariableManager get_vars()
40074 1727204628.42831: Calling all_inventory to load vars for managed-node2
40074 1727204628.42835: Calling groups_inventory to load vars for managed-node2
40074 1727204628.42838: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204628.42852: Calling all_plugins_play to load vars for managed-node2
40074 1727204628.42856: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204628.42861: Calling groups_plugins_play to load vars for managed-node2
40074 1727204628.45242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204628.48476: done with get_vars()
40074 1727204628.48522: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024 15:03:48 -0400 (0:00:00.081) 0:00:22.248 *****
40074 1727204628.48638: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections
40074 1727204628.48641: Creating lock for fedora.linux_system_roles.network_connections
40074 1727204628.49064: worker is 1 (out of 1 available)
40074 1727204628.49080: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections
40074 1727204628.49099: done queuing things up, now waiting for results queue to drain
40074 1727204628.49100: waiting for pending results...
40074 1727204628.49393: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles
40074 1727204628.49549: in run() - task 12b410aa-8751-9fd7-2501-00000000002b
40074 1727204628.49573: variable 'ansible_search_path' from source: unknown
40074 1727204628.49581: variable 'ansible_search_path' from source: unknown
40074 1727204628.49628: calling self._execute()
40074 1727204628.49738: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204628.49796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204628.49799: variable 'omit' from source: magic vars
40074 1727204628.50221: variable 'ansible_distribution_major_version' from source: facts
40074 1727204628.50243: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204628.50257: variable 'omit' from source: magic vars
40074 1727204628.50344: variable 'omit' from source: magic vars
40074 1727204628.50593: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204628.53997: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204628.54001: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204628.54005: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204628.54008: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204628.54011: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204628.54099: variable 'network_provider' from source: set_fact
40074 1727204628.54271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204628.54313: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204628.54354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204628.54412: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204628.54439: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204628.54535: variable 'omit' from source: magic vars
40074 1727204628.54683: variable 'omit' from source: magic vars
40074 1727204628.54828: variable 'network_connections' from source: task vars
40074 1727204628.54849: variable 'interface0' from source: play vars
40074 1727204628.54939: variable 'interface0' from source: play vars
40074 1727204628.54953: variable 'interface0' from source: play vars
40074 1727204628.55033: variable 'interface0' from source: play vars
40074 1727204628.55055: variable 'interface1' from source: play vars
40074 1727204628.55143: variable 'interface1' from source: play vars
40074 1727204628.55158: variable 'interface1' from source: play vars
40074 1727204628.55237: variable 'interface1' from source: play vars
40074 1727204628.55546: variable 'omit' from source: magic vars
40074 1727204628.55562: variable '__lsr_ansible_managed' from source: task vars
40074 1727204628.55644: variable '__lsr_ansible_managed' from source: task vars
40074 1727204628.55993: Loaded config def from plugin (lookup/template) 40074 1727204628.56005: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 40074 1727204628.56045: File lookup term: get_ansible_managed.j2 40074 1727204628.56053: variable 'ansible_search_path' from source: unknown 40074 1727204628.56063: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 40074 1727204628.56082: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 40074 1727204628.56109: variable 'ansible_search_path' from source: unknown 40074 1727204628.66622: variable 'ansible_managed' from source: unknown 40074 1727204628.66766: variable 'omit' from source: magic vars 40074 1727204628.66792: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204628.66820: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204628.66836: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204628.66854: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204628.66864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204628.66892: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204628.66895: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204628.66900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204628.66986: Set connection var ansible_pipelining to False 40074 1727204628.66994: Set connection var ansible_shell_executable to /bin/sh 40074 1727204628.66998: Set connection var ansible_shell_type to sh 40074 1727204628.67001: Set connection var ansible_connection to ssh 40074 1727204628.67008: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204628.67015: Set connection var ansible_timeout to 10 40074 1727204628.67039: variable 'ansible_shell_executable' from source: unknown 40074 1727204628.67043: variable 'ansible_connection' from source: unknown 40074 1727204628.67045: variable 'ansible_module_compression' from source: unknown 40074 1727204628.67048: variable 'ansible_shell_type' from source: unknown 40074 1727204628.67055: variable 'ansible_shell_executable' from source: unknown 40074 1727204628.67058: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204628.67061: variable 'ansible_pipelining' from source: unknown 40074 1727204628.67064: variable 'ansible_timeout' from source: unknown 40074 1727204628.67073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 
1727204628.67184: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204628.67197: variable 'omit' from source: magic vars 40074 1727204628.67206: starting attempt loop 40074 1727204628.67209: running the handler 40074 1727204628.67224: _low_level_execute_command(): starting 40074 1727204628.67231: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204628.67767: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204628.67771: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204628.67774: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204628.67776: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204628.67832: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204628.67835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204628.67888: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204628.69673: stdout chunk (state=3): >>>/root <<< 40074 1727204628.69782: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204628.69842: stderr chunk (state=3): >>><<< 40074 1727204628.69845: stdout chunk (state=3): >>><<< 40074 1727204628.69867: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204628.69879: _low_level_execute_command(): starting 40074 1727204628.69885: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803 `" && echo ansible-tmp-1727204628.698679-40996-80450650781803="` echo /root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803 `" ) && sleep 0' 
40074 1727204628.70357: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204628.70360: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204628.70365: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204628.70368: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204628.70419: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204628.70426: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204628.70470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204628.72531: stdout chunk (state=3): >>>ansible-tmp-1727204628.698679-40996-80450650781803=/root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803 <<< 40074 1727204628.72649: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204628.72706: stderr chunk (state=3): >>><<< 40074 1727204628.72709: stdout chunk (state=3): >>><<< 40074 1727204628.72724: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204628.698679-40996-80450650781803=/root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204628.72788: variable 'ansible_module_compression' from source: unknown 40074 1727204628.72818: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 40074 1727204628.72823: ANSIBALLZ: Acquiring lock 40074 1727204628.72826: ANSIBALLZ: Lock acquired: 139809958431488 40074 1727204628.72832: ANSIBALLZ: Creating module 40074 1727204628.97262: ANSIBALLZ: Writing module into payload 40074 1727204628.97599: ANSIBALLZ: Writing module 40074 1727204628.97625: ANSIBALLZ: Renaming module 40074 1727204628.97629: ANSIBALLZ: Done creating module 40074 1727204628.97653: variable 'ansible_facts' from source: unknown 40074 1727204628.97723: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/AnsiballZ_network_connections.py 40074 1727204628.97840: Sending initial data 40074 1727204628.97843: Sent initial data (166 bytes) 40074 1727204628.98384: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204628.98388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204628.98394: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204628.98406: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204628.98427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204628.98478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204629.00281: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports 
extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204629.00285: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204629.00345: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpwuxyxamt /root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/AnsiballZ_network_connections.py <<< 40074 1727204629.00350: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/AnsiballZ_network_connections.py" <<< 40074 1727204629.00401: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpwuxyxamt" to remote "/root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/AnsiballZ_network_connections.py" <<< 40074 1727204629.01853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204629.01997: stderr chunk (state=3): >>><<< 40074 1727204629.02001: stdout chunk (state=3): >>><<< 40074 1727204629.02003: done transferring module to remote 40074 1727204629.02006: _low_level_execute_command(): starting 40074 1727204629.02008: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/ /root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/AnsiballZ_network_connections.py && sleep 0' 
40074 1727204629.02466: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204629.02470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204629.02472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204629.02475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204629.02477: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204629.02531: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204629.02535: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204629.02582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204629.04724: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204629.04729: stdout chunk (state=3): >>><<< 40074 1727204629.04731: stderr chunk (state=3): >>><<< 40074 1727204629.04734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204629.04736: _low_level_execute_command(): starting 40074 1727204629.04738: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/AnsiballZ_network_connections.py && sleep 0' 40074 1727204629.05337: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204629.05353: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204629.05369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204629.05388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204629.05419: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204629.05445: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204629.05505: stderr 
chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204629.05573: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204629.05588: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204629.05620: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204629.05698: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204629.45544: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", 
"state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 40074 1727204629.47692: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204629.47748: stderr chunk (state=3): >>><<< 40074 1727204629.47752: stdout chunk (state=3): >>><<< 40074 1727204629.47769: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec\n[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": "ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "interface_name": "ethtest0", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.3/24", "2001:db8::2/32"], "route": [{"network": "198.51.10.64", "prefix": 26, "gateway": "198.51.100.6", "metric": 4}, {"network": "2001:db6::4", "prefix": 128, "gateway": "2001:db8::1", "metric": 2}]}}, {"name": "ethtest1", "interface_name": 
"ethtest1", "state": "up", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.6/24", "2001:db8::4/32"], "route": [{"network": "198.51.12.128", "prefix": 26, "gateway": "198.51.100.1", "metric": 2}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204629.47847: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'interface_name': 'ethtest0', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.3/24', '2001:db8::2/32'], 'route': [{'network': '198.51.10.64', 'prefix': 26, 'gateway': '198.51.100.6', 'metric': 4}, {'network': '2001:db6::4', 'prefix': 128, 'gateway': '2001:db8::1', 'metric': 2}]}}, {'name': 'ethtest1', 'interface_name': 'ethtest1', 'state': 'up', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.6/24', '2001:db8::4/32'], 'route': [{'network': '198.51.12.128', 'prefix': 26, 'gateway': '198.51.100.1', 'metric': 2}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204629.47856: _low_level_execute_command(): starting 40074 1727204629.47862: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204628.698679-40996-80450650781803/ > /dev/null 2>&1 && sleep 0' 40074 1727204629.48375: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204629.48383: stderr 
chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204629.48386: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204629.48388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204629.48444: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204629.48453: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204629.48455: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204629.48497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204629.50432: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204629.50481: stderr chunk (state=3): >>><<< 40074 1727204629.50485: stdout chunk (state=3): >>><<< 40074 1727204629.50501: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204629.50510: handler run complete 40074 1727204629.50570: attempt loop complete, returning result 40074 1727204629.50574: _execute() done 40074 1727204629.50577: dumping result to json 40074 1727204629.50585: done dumping result, returning 40074 1727204629.50596: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-9fd7-2501-00000000002b] 40074 1727204629.50601: sending task result for task 12b410aa-8751-9fd7-2501-00000000002b 40074 1727204629.50739: done sending task result for task 12b410aa-8751-9fd7-2501-00000000002b 40074 1727204629.50742: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ 
"198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec [006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07 [007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec (not-active) [008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07 (not-active) 40074 1727204629.51009: no more pending results, returning what we have 40074 1727204629.51013: results queue empty 40074 1727204629.51014: checking for any_errors_fatal 40074 1727204629.51020: done checking for any_errors_fatal 40074 1727204629.51021: checking for max_fail_percentage 40074 1727204629.51023: done checking for max_fail_percentage 40074 1727204629.51024: checking to see if all hosts have failed and the running result is not ok 40074 1727204629.51025: done checking to see if all hosts have failed 40074 1727204629.51026: getting the remaining hosts for this loop 40074 1727204629.51028: done getting the remaining hosts for this loop 40074 1727204629.51032: getting the next task for host managed-node2 40074 1727204629.51038: done getting next task for host managed-node2 40074 1727204629.51042: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 40074 1727204629.51046: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204629.51058: getting variables 40074 1727204629.51059: in VariableManager get_vars() 40074 1727204629.51109: Calling all_inventory to load vars for managed-node2 40074 1727204629.51112: Calling groups_inventory to load vars for managed-node2 40074 1727204629.51115: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204629.51126: Calling all_plugins_play to load vars for managed-node2 40074 1727204629.51129: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204629.51137: Calling groups_plugins_play to load vars for managed-node2 40074 1727204629.52518: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204629.54133: done with get_vars() 40074 1727204629.54157: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:03:49 -0400 (0:00:01.055) 0:00:23.304 ***** 40074 1727204629.54236: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 40074 1727204629.54238: Creating lock for fedora.linux_system_roles.network_state 40074 1727204629.54519: worker is 1 (out of 1 available) 40074 1727204629.54534: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 40074 1727204629.54549: done queuing things up, now waiting for results queue to drain 40074 1727204629.54551: waiting for pending results... 
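The `module_args` logged for the "Configure networking connection profiles" task above correspond to a `network_connections` role variable roughly like the following. This is a reconstruction from the logged invocation, not the original playbook; the play/role wrapping is assumed, but the connection values (names, addresses, routes, metrics) are taken verbatim from the log:

```yaml
# Hypothetical playbook sketch reconstructed from the logged module_args.
- hosts: managed-node2
  roles:
    - role: fedora.linux_system_roles.network
      vars:
        network_connections:
          - name: ethtest0
            interface_name: ethtest0
            type: ethernet
            state: up
            autoconnect: false
            ip:
              address:
                - 198.51.100.3/24
                - 2001:db8::2/32
              route:
                - network: 198.51.10.64
                  prefix: 26
                  gateway: 198.51.100.6
                  metric: 4
                - network: "2001:db6::4"
                  prefix: 128
                  gateway: "2001:db8::1"
                  metric: 2
          - name: ethtest1
            interface_name: ethtest1
            type: ethernet
            state: up
            autoconnect: false
            ip:
              address:
                - 198.51.100.6/24
                - 2001:db8::4/32
              route:
                - network: 198.51.12.128
                  prefix: 26
                  gateway: 198.51.100.1
                  metric: 2
```

With `provider: nm` (as logged), the role translates each entry into a NetworkManager connection profile, which matches the `add connection` / `up connection` actions in the task's STDERR.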
40074 1727204629.54755: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 40074 1727204629.54867: in run() - task 12b410aa-8751-9fd7-2501-00000000002c 40074 1727204629.54884: variable 'ansible_search_path' from source: unknown 40074 1727204629.54888: variable 'ansible_search_path' from source: unknown 40074 1727204629.54928: calling self._execute() 40074 1727204629.55012: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.55019: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.55034: variable 'omit' from source: magic vars 40074 1727204629.55365: variable 'ansible_distribution_major_version' from source: facts 40074 1727204629.55376: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204629.55484: variable 'network_state' from source: role '' defaults 40074 1727204629.55495: Evaluated conditional (network_state != {}): False 40074 1727204629.55499: when evaluation is False, skipping this task 40074 1727204629.55502: _execute() done 40074 1727204629.55509: dumping result to json 40074 1727204629.55511: done dumping result, returning 40074 1727204629.55520: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-9fd7-2501-00000000002c] 40074 1727204629.55527: sending task result for task 12b410aa-8751-9fd7-2501-00000000002c 40074 1727204629.55620: done sending task result for task 12b410aa-8751-9fd7-2501-00000000002c 40074 1727204629.55624: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204629.55683: no more pending results, returning what we have 40074 1727204629.55688: results queue empty 40074 1727204629.55691: checking for any_errors_fatal 40074 1727204629.55714: done checking for any_errors_fatal 
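The "Configure networking state" task above skips because `network_state` is at its role default of `{}` (the logged condition `network_state != {}` evaluates to `False`). As a hypothetical illustration, a non-empty `network_state` — which this role passes through to the nmstate-based provider — would make the task run; this sketch is not from the original playbook:

```yaml
# Hypothetical: supplying network_state instead of (or alongside)
# network_connections would cause the skipped task to execute.
network_state:
  interfaces:
    - name: ethtest0
      type: ethernet
      state: up
```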
40074 1727204629.55715: checking for max_fail_percentage 40074 1727204629.55717: done checking for max_fail_percentage 40074 1727204629.55718: checking to see if all hosts have failed and the running result is not ok 40074 1727204629.55719: done checking to see if all hosts have failed 40074 1727204629.55720: getting the remaining hosts for this loop 40074 1727204629.55722: done getting the remaining hosts for this loop 40074 1727204629.55726: getting the next task for host managed-node2 40074 1727204629.55735: done getting next task for host managed-node2 40074 1727204629.55740: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 40074 1727204629.55743: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204629.55760: getting variables 40074 1727204629.55762: in VariableManager get_vars() 40074 1727204629.55811: Calling all_inventory to load vars for managed-node2 40074 1727204629.55814: Calling groups_inventory to load vars for managed-node2 40074 1727204629.55817: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204629.55827: Calling all_plugins_play to load vars for managed-node2 40074 1727204629.55830: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204629.55833: Calling groups_plugins_play to load vars for managed-node2 40074 1727204629.57085: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204629.58793: done with get_vars() 40074 1727204629.58816: done getting variables 40074 1727204629.58872: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.046) 0:00:23.350 ***** 40074 1727204629.58903: entering _queue_task() for managed-node2/debug 40074 1727204629.59174: worker is 1 (out of 1 available) 40074 1727204629.59188: exiting _queue_task() for managed-node2/debug 40074 1727204629.59205: done queuing things up, now waiting for results queue to drain 40074 1727204629.59206: waiting for pending results... 
40074 1727204629.59407: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 40074 1727204629.59508: in run() - task 12b410aa-8751-9fd7-2501-00000000002d 40074 1727204629.59523: variable 'ansible_search_path' from source: unknown 40074 1727204629.59527: variable 'ansible_search_path' from source: unknown 40074 1727204629.59563: calling self._execute() 40074 1727204629.59645: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.59651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.59664: variable 'omit' from source: magic vars 40074 1727204629.59984: variable 'ansible_distribution_major_version' from source: facts 40074 1727204629.59997: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204629.60003: variable 'omit' from source: magic vars 40074 1727204629.60054: variable 'omit' from source: magic vars 40074 1727204629.60083: variable 'omit' from source: magic vars 40074 1727204629.60126: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204629.60158: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204629.60176: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204629.60193: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204629.60206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204629.60236: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204629.60240: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.60243: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 40074 1727204629.60332: Set connection var ansible_pipelining to False 40074 1727204629.60339: Set connection var ansible_shell_executable to /bin/sh 40074 1727204629.60342: Set connection var ansible_shell_type to sh 40074 1727204629.60345: Set connection var ansible_connection to ssh 40074 1727204629.60354: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204629.60360: Set connection var ansible_timeout to 10 40074 1727204629.60382: variable 'ansible_shell_executable' from source: unknown 40074 1727204629.60386: variable 'ansible_connection' from source: unknown 40074 1727204629.60390: variable 'ansible_module_compression' from source: unknown 40074 1727204629.60393: variable 'ansible_shell_type' from source: unknown 40074 1727204629.60398: variable 'ansible_shell_executable' from source: unknown 40074 1727204629.60401: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.60407: variable 'ansible_pipelining' from source: unknown 40074 1727204629.60409: variable 'ansible_timeout' from source: unknown 40074 1727204629.60415: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.60536: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204629.60549: variable 'omit' from source: magic vars 40074 1727204629.60554: starting attempt loop 40074 1727204629.60558: running the handler 40074 1727204629.60669: variable '__network_connections_result' from source: set_fact 40074 1727204629.60726: handler run complete 40074 1727204629.60742: attempt loop complete, returning result 40074 1727204629.60745: _execute() done 40074 1727204629.60751: dumping result to json 40074 1727204629.60758: 
done dumping result, returning 40074 1727204629.60767: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-9fd7-2501-00000000002d] 40074 1727204629.60770: sending task result for task 12b410aa-8751-9fd7-2501-00000000002d 40074 1727204629.60864: done sending task result for task 12b410aa-8751-9fd7-2501-00000000002d 40074 1727204629.60868: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07 (not-active)" ] } 40074 1727204629.60949: no more pending results, returning what we have 40074 1727204629.60953: results queue empty 40074 1727204629.60954: checking for any_errors_fatal 40074 1727204629.60960: done checking for any_errors_fatal 40074 1727204629.60960: checking for max_fail_percentage 40074 1727204629.60962: done checking for max_fail_percentage 40074 1727204629.60963: checking to see if all hosts have failed and the running result is not ok 40074 1727204629.60965: done checking to see if all hosts have failed 40074 1727204629.60966: getting the remaining hosts for this loop 40074 1727204629.60967: done getting the remaining hosts for this loop 40074 1727204629.60977: getting the next task for host managed-node2 40074 1727204629.60984: done getting next task for host managed-node2 40074 1727204629.60988: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 40074 1727204629.60993: ^ state 
is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204629.61005: getting variables 40074 1727204629.61006: in VariableManager get_vars() 40074 1727204629.61046: Calling all_inventory to load vars for managed-node2 40074 1727204629.61049: Calling groups_inventory to load vars for managed-node2 40074 1727204629.61052: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204629.61061: Calling all_plugins_play to load vars for managed-node2 40074 1727204629.61064: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204629.61068: Calling groups_plugins_play to load vars for managed-node2 40074 1727204629.62315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204629.63929: done with get_vars() 40074 1727204629.63956: done getting variables 40074 1727204629.64010: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.051) 0:00:23.402 ***** 40074 
1727204629.64046: entering _queue_task() for managed-node2/debug 40074 1727204629.64320: worker is 1 (out of 1 available) 40074 1727204629.64335: exiting _queue_task() for managed-node2/debug 40074 1727204629.64350: done queuing things up, now waiting for results queue to drain 40074 1727204629.64352: waiting for pending results... 40074 1727204629.64547: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 40074 1727204629.64657: in run() - task 12b410aa-8751-9fd7-2501-00000000002e 40074 1727204629.64669: variable 'ansible_search_path' from source: unknown 40074 1727204629.64674: variable 'ansible_search_path' from source: unknown 40074 1727204629.64710: calling self._execute() 40074 1727204629.64794: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.64800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.64813: variable 'omit' from source: magic vars 40074 1727204629.65155: variable 'ansible_distribution_major_version' from source: facts 40074 1727204629.65159: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204629.65162: variable 'omit' from source: magic vars 40074 1727204629.65209: variable 'omit' from source: magic vars 40074 1727204629.65242: variable 'omit' from source: magic vars 40074 1727204629.65279: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204629.65312: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204629.65331: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204629.65348: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204629.65363: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204629.65393: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204629.65397: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.65402: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.65492: Set connection var ansible_pipelining to False 40074 1727204629.65499: Set connection var ansible_shell_executable to /bin/sh 40074 1727204629.65502: Set connection var ansible_shell_type to sh 40074 1727204629.65505: Set connection var ansible_connection to ssh 40074 1727204629.65512: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204629.65521: Set connection var ansible_timeout to 10 40074 1727204629.65542: variable 'ansible_shell_executable' from source: unknown 40074 1727204629.65545: variable 'ansible_connection' from source: unknown 40074 1727204629.65548: variable 'ansible_module_compression' from source: unknown 40074 1727204629.65551: variable 'ansible_shell_type' from source: unknown 40074 1727204629.65556: variable 'ansible_shell_executable' from source: unknown 40074 1727204629.65559: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.65565: variable 'ansible_pipelining' from source: unknown 40074 1727204629.65568: variable 'ansible_timeout' from source: unknown 40074 1727204629.65575: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.65709: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204629.65723: variable 'omit' from source: magic vars 40074 1727204629.65730: starting attempt 
loop 40074 1727204629.65733: running the handler 40074 1727204629.65774: variable '__network_connections_result' from source: set_fact 40074 1727204629.65847: variable '__network_connections_result' from source: set_fact 40074 1727204629.66030: handler run complete 40074 1727204629.66065: attempt loop complete, returning result 40074 1727204629.66068: _execute() done 40074 1727204629.66071: dumping result to json 40074 1727204629.66079: done dumping result, returning 40074 1727204629.66087: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-9fd7-2501-00000000002e] 40074 1727204629.66094: sending task result for task 12b410aa-8751-9fd7-2501-00000000002e 40074 1727204629.66210: done sending task result for task 12b410aa-8751-9fd7-2501-00000000002e 40074 1727204629.66213: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": false, "interface_name": "ethtest0", "ip": { "address": [ "198.51.100.3/24", "2001:db8::2/32" ], "route": [ { "gateway": "198.51.100.6", "metric": 4, "network": "198.51.10.64", "prefix": 26 }, { "gateway": "2001:db8::1", "metric": 2, "network": "2001:db6::4", "prefix": 128 } ] }, "name": "ethtest0", "state": "up", "type": "ethernet" }, { "autoconnect": false, "interface_name": "ethtest1", "ip": { "address": [ "198.51.100.6/24", "2001:db8::4/32" ], "route": [ { "gateway": "198.51.100.1", "metric": 2, "network": "198.51.12.128", "prefix": 26 } ] }, "name": "ethtest1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec\n[006] #1, state:up 
persistent_state:present, 'ethtest1': add connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07\n[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec (not-active)\n[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07 (not-active)\n", "stderr_lines": [ "[005] #0, state:up persistent_state:present, 'ethtest0': add connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec", "[006] #1, state:up persistent_state:present, 'ethtest1': add connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07", "[007] #0, state:up persistent_state:present, 'ethtest0': up connection ethtest0, 00af1773-a047-48e6-9537-86cd5f38b3ec (not-active)", "[008] #1, state:up persistent_state:present, 'ethtest1': up connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07 (not-active)" ] } } 40074 1727204629.66381: no more pending results, returning what we have 40074 1727204629.66384: results queue empty 40074 1727204629.66385: checking for any_errors_fatal 40074 1727204629.66398: done checking for any_errors_fatal 40074 1727204629.66399: checking for max_fail_percentage 40074 1727204629.66401: done checking for max_fail_percentage 40074 1727204629.66402: checking to see if all hosts have failed and the running result is not ok 40074 1727204629.66403: done checking to see if all hosts have failed 40074 1727204629.66404: getting the remaining hosts for this loop 40074 1727204629.66405: done getting the remaining hosts for this loop 40074 1727204629.66409: getting the next task for host managed-node2 40074 1727204629.66414: done getting next task for host managed-node2 40074 1727204629.66419: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 40074 1727204629.66422: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, 
pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204629.66434: getting variables 40074 1727204629.66436: in VariableManager get_vars() 40074 1727204629.66471: Calling all_inventory to load vars for managed-node2 40074 1727204629.66474: Calling groups_inventory to load vars for managed-node2 40074 1727204629.66475: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204629.66483: Calling all_plugins_play to load vars for managed-node2 40074 1727204629.66485: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204629.66487: Calling groups_plugins_play to load vars for managed-node2 40074 1727204629.72852: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204629.75896: done with get_vars() 40074 1727204629.75940: done getting variables 40074 1727204629.76008: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.119) 0:00:23.522 ***** 40074 1727204629.76043: entering _queue_task() for managed-node2/debug 40074 1727204629.76424: worker is 1 (out of 1 available) 40074 1727204629.76440: 
exiting _queue_task() for managed-node2/debug 40074 1727204629.76456: done queuing things up, now waiting for results queue to drain 40074 1727204629.76459: waiting for pending results... 40074 1727204629.76761: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 40074 1727204629.76998: in run() - task 12b410aa-8751-9fd7-2501-00000000002f 40074 1727204629.77003: variable 'ansible_search_path' from source: unknown 40074 1727204629.77005: variable 'ansible_search_path' from source: unknown 40074 1727204629.77033: calling self._execute() 40074 1727204629.77291: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.77297: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.77300: variable 'omit' from source: magic vars 40074 1727204629.77714: variable 'ansible_distribution_major_version' from source: facts 40074 1727204629.77727: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204629.77894: variable 'network_state' from source: role '' defaults 40074 1727204629.77909: Evaluated conditional (network_state != {}): False 40074 1727204629.77913: when evaluation is False, skipping this task 40074 1727204629.77944: _execute() done 40074 1727204629.77948: dumping result to json 40074 1727204629.77951: done dumping result, returning 40074 1727204629.77955: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-9fd7-2501-00000000002f] 40074 1727204629.77957: sending task result for task 12b410aa-8751-9fd7-2501-00000000002f 40074 1727204629.78170: done sending task result for task 12b410aa-8751-9fd7-2501-00000000002f 40074 1727204629.78173: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 40074 1727204629.78224: no more pending results, returning what we have 40074 
1727204629.78228: results queue empty 40074 1727204629.78229: checking for any_errors_fatal 40074 1727204629.78239: done checking for any_errors_fatal 40074 1727204629.78240: checking for max_fail_percentage 40074 1727204629.78241: done checking for max_fail_percentage 40074 1727204629.78243: checking to see if all hosts have failed and the running result is not ok 40074 1727204629.78244: done checking to see if all hosts have failed 40074 1727204629.78245: getting the remaining hosts for this loop 40074 1727204629.78246: done getting the remaining hosts for this loop 40074 1727204629.78249: getting the next task for host managed-node2 40074 1727204629.78255: done getting next task for host managed-node2 40074 1727204629.78260: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 40074 1727204629.78337: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204629.78362: getting variables 40074 1727204629.78365: in VariableManager get_vars() 40074 1727204629.78414: Calling all_inventory to load vars for managed-node2 40074 1727204629.78417: Calling groups_inventory to load vars for managed-node2 40074 1727204629.78420: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204629.78432: Calling all_plugins_play to load vars for managed-node2 40074 1727204629.78436: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204629.78440: Calling groups_plugins_play to load vars for managed-node2 40074 1727204629.80716: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204629.84149: done with get_vars() 40074 1727204629.84184: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:03:49 -0400 (0:00:00.082) 0:00:23.604 ***** 40074 1727204629.84324: entering _queue_task() for managed-node2/ping 40074 1727204629.84326: Creating lock for ping 40074 1727204629.84925: worker is 1 (out of 1 available) 40074 1727204629.84937: exiting _queue_task() for managed-node2/ping 40074 1727204629.84949: done queuing things up, now waiting for results queue to drain 40074 1727204629.84950: waiting for pending results... 
40074 1727204629.85126: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 40074 1727204629.85335: in run() - task 12b410aa-8751-9fd7-2501-000000000030 40074 1727204629.85340: variable 'ansible_search_path' from source: unknown 40074 1727204629.85343: variable 'ansible_search_path' from source: unknown 40074 1727204629.85346: calling self._execute() 40074 1727204629.85436: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.85508: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.85513: variable 'omit' from source: magic vars 40074 1727204629.86206: variable 'ansible_distribution_major_version' from source: facts 40074 1727204629.86220: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204629.86224: variable 'omit' from source: magic vars 40074 1727204629.86301: variable 'omit' from source: magic vars 40074 1727204629.86380: variable 'omit' from source: magic vars 40074 1727204629.86387: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204629.86437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204629.86460: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204629.86528: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204629.86610: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204629.86650: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204629.86654: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.86657: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 40074 1727204629.87073: Set connection var ansible_pipelining to False 40074 1727204629.87077: Set connection var ansible_shell_executable to /bin/sh 40074 1727204629.87080: Set connection var ansible_shell_type to sh 40074 1727204629.87084: Set connection var ansible_connection to ssh 40074 1727204629.87086: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204629.87090: Set connection var ansible_timeout to 10 40074 1727204629.87147: variable 'ansible_shell_executable' from source: unknown 40074 1727204629.87152: variable 'ansible_connection' from source: unknown 40074 1727204629.87155: variable 'ansible_module_compression' from source: unknown 40074 1727204629.87157: variable 'ansible_shell_type' from source: unknown 40074 1727204629.87159: variable 'ansible_shell_executable' from source: unknown 40074 1727204629.87162: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204629.87164: variable 'ansible_pipelining' from source: unknown 40074 1727204629.87167: variable 'ansible_timeout' from source: unknown 40074 1727204629.87169: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204629.87712: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204629.87746: variable 'omit' from source: magic vars 40074 1727204629.87752: starting attempt loop 40074 1727204629.87755: running the handler 40074 1727204629.87759: _low_level_execute_command(): starting 40074 1727204629.87821: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204629.88711: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204629.88779: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204629.88782: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204629.88816: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204629.88898: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204629.90674: stdout chunk (state=3): >>>/root <<< 40074 1727204629.90813: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204629.90834: stderr chunk (state=3): >>><<< 40074 1727204629.90838: stdout chunk (state=3): >>><<< 40074 1727204629.90861: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204629.90874: _low_level_execute_command(): starting 40074 1727204629.90882: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909 `" && echo ansible-tmp-1727204629.9086246-41036-48432523489909="` echo /root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909 `" ) && sleep 0' 40074 1727204629.91333: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204629.91339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204629.91342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204629.91345: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204629.91399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204629.91406: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204629.91447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204629.93514: stdout chunk (state=3): >>>ansible-tmp-1727204629.9086246-41036-48432523489909=/root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909 <<< 40074 1727204629.93801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204629.93805: stdout chunk (state=3): >>><<< 40074 1727204629.93808: stderr chunk (state=3): >>><<< 40074 1727204629.93811: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204629.9086246-41036-48432523489909=/root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204629.93814: variable 'ansible_module_compression' from source: unknown 40074 1727204629.93862: ANSIBALLZ: Using lock for ping 40074 1727204629.93870: ANSIBALLZ: Acquiring lock 40074 1727204629.93878: ANSIBALLZ: Lock acquired: 139809958319824 40074 1727204629.93888: ANSIBALLZ: Creating module 40074 1727204630.12932: ANSIBALLZ: Writing module into payload 40074 1727204630.12979: ANSIBALLZ: Writing module 40074 1727204630.13000: ANSIBALLZ: Renaming module 40074 1727204630.13006: ANSIBALLZ: Done creating module 40074 1727204630.13022: variable 'ansible_facts' from source: unknown 40074 1727204630.13068: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/AnsiballZ_ping.py 40074 1727204630.13193: Sending initial data 40074 1727204630.13196: Sent initial data (152 bytes) 40074 1727204630.13755: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204630.13787: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.13845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.13876: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.15652: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204630.15681: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204630.15718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp4soijf34 /root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/AnsiballZ_ping.py <<< 40074 1727204630.15722: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/AnsiballZ_ping.py" <<< 40074 1727204630.15753: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp4soijf34" to remote "/root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/AnsiballZ_ping.py" <<< 40074 1727204630.16508: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204630.16567: stderr chunk (state=3): >>><<< 40074 1727204630.16571: stdout chunk (state=3): >>><<< 40074 1727204630.16592: done transferring module to remote 40074 1727204630.16604: _low_level_execute_command(): starting 40074 1727204630.16610: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/ /root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/AnsiballZ_ping.py && sleep 0' 40074 1727204630.17050: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204630.17092: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204630.17096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.17099: 
stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204630.17102: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204630.17105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.17161: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204630.17169: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.17204: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.19110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204630.19157: stderr chunk (state=3): >>><<< 40074 1727204630.19160: stdout chunk (state=3): >>><<< 40074 1727204630.19176: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204630.19179: _low_level_execute_command(): starting 40074 1727204630.19185: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/AnsiballZ_ping.py && sleep 0' 40074 1727204630.19658: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204630.19661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.19664: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204630.19667: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.19711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204630.19715: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.19766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.37022: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 40074 1727204630.38418: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204630.38479: stderr chunk (state=3): >>><<< 40074 1727204630.38483: stdout chunk (state=3): >>><<< 40074 1727204630.38502: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204630.38528: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204630.38541: _low_level_execute_command(): starting 40074 1727204630.38548: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204629.9086246-41036-48432523489909/ > /dev/null 2>&1 && sleep 0' 40074 1727204630.39045: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204630.39048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204630.39050: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.39053: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204630.39055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match 
for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.39121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204630.39123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204630.39125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.39158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.41106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204630.41160: stderr chunk (state=3): >>><<< 40074 1727204630.41164: stdout chunk (state=3): >>><<< 40074 1727204630.41179: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 
1727204630.41187: handler run complete 40074 1727204630.41205: attempt loop complete, returning result 40074 1727204630.41208: _execute() done 40074 1727204630.41212: dumping result to json 40074 1727204630.41217: done dumping result, returning 40074 1727204630.41230: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-9fd7-2501-000000000030] 40074 1727204630.41236: sending task result for task 12b410aa-8751-9fd7-2501-000000000030 40074 1727204630.41337: done sending task result for task 12b410aa-8751-9fd7-2501-000000000030 40074 1727204630.41341: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 40074 1727204630.41417: no more pending results, returning what we have 40074 1727204630.41420: results queue empty 40074 1727204630.41421: checking for any_errors_fatal 40074 1727204630.41429: done checking for any_errors_fatal 40074 1727204630.41430: checking for max_fail_percentage 40074 1727204630.41431: done checking for max_fail_percentage 40074 1727204630.41432: checking to see if all hosts have failed and the running result is not ok 40074 1727204630.41434: done checking to see if all hosts have failed 40074 1727204630.41435: getting the remaining hosts for this loop 40074 1727204630.41436: done getting the remaining hosts for this loop 40074 1727204630.41441: getting the next task for host managed-node2 40074 1727204630.41452: done getting next task for host managed-node2 40074 1727204630.41454: ^ task is: TASK: meta (role_complete) 40074 1727204630.41457: ^ state is: HOST STATE: block=3, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204630.41469: getting variables 40074 1727204630.41471: in VariableManager get_vars() 40074 1727204630.41526: Calling all_inventory to load vars for managed-node2 40074 1727204630.41530: Calling groups_inventory to load vars for managed-node2 40074 1727204630.41532: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204630.41544: Calling all_plugins_play to load vars for managed-node2 40074 1727204630.41547: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204630.41551: Calling groups_plugins_play to load vars for managed-node2 40074 1727204630.42841: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204630.44481: done with get_vars() 40074 1727204630.44508: done getting variables 40074 1727204630.44581: done queuing things up, now waiting for results queue to drain 40074 1727204630.44583: results queue empty 40074 1727204630.44584: checking for any_errors_fatal 40074 1727204630.44586: done checking for any_errors_fatal 40074 1727204630.44586: checking for max_fail_percentage 40074 1727204630.44587: done checking for max_fail_percentage 40074 1727204630.44588: checking to see if all hosts have failed and the running result is not ok 40074 1727204630.44591: done checking to see if all hosts have failed 40074 1727204630.44592: getting the remaining hosts for this loop 40074 1727204630.44593: done getting the remaining hosts for this loop 40074 1727204630.44595: getting the next task for host managed-node2 40074 1727204630.44598: done getting next task for host managed-node2 40074 1727204630.44600: ^ task is: TASK: Get the IPv4 routes from the route table main 40074 1727204630.44601: ^ state is: HOST STATE: block=3, task=3, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204630.44603: getting variables 40074 1727204630.44604: in VariableManager get_vars() 40074 1727204630.44616: Calling all_inventory to load vars for managed-node2 40074 1727204630.44619: Calling groups_inventory to load vars for managed-node2 40074 1727204630.44621: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204630.44625: Calling all_plugins_play to load vars for managed-node2 40074 1727204630.44627: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204630.44629: Calling groups_plugins_play to load vars for managed-node2 40074 1727204630.46250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204630.47926: done with get_vars() 40074 1727204630.47951: done getting variables 40074 1727204630.47991: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the IPv4 routes from the route table main] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:73 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.636) 0:00:24.241 ***** 40074 1727204630.48015: entering _queue_task() for managed-node2/command 40074 1727204630.48367: worker is 1 (out of 1 available) 40074 1727204630.48382: exiting _queue_task() for managed-node2/command 40074 1727204630.48537: done queuing things up, now waiting for results queue to drain 40074 1727204630.48540: waiting for pending results... 
40074 1727204630.48716: running TaskExecutor() for managed-node2/TASK: Get the IPv4 routes from the route table main 40074 1727204630.48782: in run() - task 12b410aa-8751-9fd7-2501-000000000060 40074 1727204630.48806: variable 'ansible_search_path' from source: unknown 40074 1727204630.48857: calling self._execute() 40074 1727204630.49186: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204630.49425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204630.49433: variable 'omit' from source: magic vars 40074 1727204630.49871: variable 'ansible_distribution_major_version' from source: facts 40074 1727204630.49896: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204630.49911: variable 'omit' from source: magic vars 40074 1727204630.49945: variable 'omit' from source: magic vars 40074 1727204630.50003: variable 'omit' from source: magic vars 40074 1727204630.50058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204630.50114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204630.50143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204630.50187: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204630.50196: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204630.50236: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204630.50296: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204630.50300: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204630.50392: Set connection var ansible_pipelining to False 40074 
1727204630.50411: Set connection var ansible_shell_executable to /bin/sh 40074 1727204630.50419: Set connection var ansible_shell_type to sh 40074 1727204630.50427: Set connection var ansible_connection to ssh 40074 1727204630.50438: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204630.50451: Set connection var ansible_timeout to 10 40074 1727204630.50485: variable 'ansible_shell_executable' from source: unknown 40074 1727204630.50495: variable 'ansible_connection' from source: unknown 40074 1727204630.50514: variable 'ansible_module_compression' from source: unknown 40074 1727204630.50517: variable 'ansible_shell_type' from source: unknown 40074 1727204630.50520: variable 'ansible_shell_executable' from source: unknown 40074 1727204630.50595: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204630.50598: variable 'ansible_pipelining' from source: unknown 40074 1727204630.50601: variable 'ansible_timeout' from source: unknown 40074 1727204630.50603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204630.50716: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204630.50744: variable 'omit' from source: magic vars 40074 1727204630.50758: starting attempt loop 40074 1727204630.50766: running the handler 40074 1727204630.50787: _low_level_execute_command(): starting 40074 1727204630.50803: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204630.51566: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204630.51580: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204630.51602: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204630.51726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204630.51761: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.51836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.53609: stdout chunk (state=3): >>>/root <<< 40074 1727204630.53804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204630.53807: stdout chunk (state=3): >>><<< 40074 1727204630.53809: stderr chunk (state=3): >>><<< 40074 1727204630.53895: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204630.53898: _low_level_execute_command(): starting 40074 1727204630.53901: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976 `" && echo ansible-tmp-1727204630.5383465-41056-11311711248976="` echo /root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976 `" ) && sleep 0' 40074 1727204630.54500: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204630.54517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204630.54540: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204630.54565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204630.54585: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204630.54662: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.54726: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204630.54744: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204630.54776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.54851: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.56917: stdout chunk (state=3): >>>ansible-tmp-1727204630.5383465-41056-11311711248976=/root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976 <<< 40074 1727204630.57109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204630.57128: stdout chunk (state=3): >>><<< 40074 1727204630.57140: stderr chunk (state=3): >>><<< 40074 1727204630.57295: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204630.5383465-41056-11311711248976=/root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204630.57298: variable 'ansible_module_compression' from source: unknown 40074 1727204630.57301: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204630.57322: variable 'ansible_facts' from source: unknown 40074 1727204630.57435: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/AnsiballZ_command.py 40074 1727204630.57666: Sending initial data 40074 1727204630.57670: Sent initial data (155 bytes) 40074 1727204630.58359: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204630.58415: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204630.58434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204630.58522: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.58578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204630.58607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.58716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.60566: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204630.60599: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204630.60642: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp7xf6317b /root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/AnsiballZ_command.py <<< 40074 1727204630.60644: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/AnsiballZ_command.py" <<< 40074 1727204630.60673: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp7xf6317b" to remote "/root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/AnsiballZ_command.py" <<< 40074 1727204630.61474: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204630.61534: stderr chunk (state=3): >>><<< 40074 1727204630.61538: stdout chunk (state=3): >>><<< 40074 1727204630.61562: done transferring module to remote 40074 1727204630.61575: _low_level_execute_command(): starting 40074 1727204630.61581: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/ /root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/AnsiballZ_command.py && sleep 0' 40074 1727204630.62209: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204630.62312: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204630.62358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.62393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.64370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204630.64414: stderr chunk (state=3): >>><<< 40074 1727204630.64418: stdout chunk (state=3): >>><<< 40074 1727204630.64435: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204630.64443: _low_level_execute_command(): starting 40074 1727204630.64446: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/AnsiballZ_command.py && sleep 0' 40074 1727204630.65012: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.65055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204630.65071: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204630.65091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.65174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.83248: stdout chunk (state=3): >>> {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link 
src 10.31.9.159 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-24 15:03:50.826929", "end": "2024-09-24 15:03:50.831315", "delta": "0:00:00.004386", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204630.84931: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204630.85000: stderr chunk (state=3): >>><<< 40074 1727204630.85004: stdout chunk (state=3): >>><<< 40074 1727204630.85027: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \n198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 \n198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 \n198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 \n198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 ", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "route"], "start": "2024-09-24 15:03:50.826929", "end": "2024-09-24 15:03:50.831315", "delta": "0:00:00.004386", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 route", "_uses_shell": 
false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204630.85067: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204630.85076: _low_level_execute_command(): starting 40074 1727204630.85082: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204630.5383465-41056-11311711248976/ > /dev/null 2>&1 && sleep 0' 40074 1727204630.85548: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204630.85553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204630.85591: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204630.85595: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204630.85598: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204630.85601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204630.85658: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204630.85662: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204630.85710: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204630.87666: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204630.87715: stderr chunk (state=3): >>><<< 40074 1727204630.87721: stdout chunk (state=3): >>><<< 40074 1727204630.87734: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 40074 1727204630.87741: handler run complete 40074 1727204630.87770: Evaluated conditional (False): False 40074 1727204630.87784: attempt loop complete, returning result 40074 1727204630.87787: _execute() done 40074 1727204630.87793: dumping result to json 40074 1727204630.87800: done dumping result, returning 40074 1727204630.87808: done running TaskExecutor() for managed-node2/TASK: Get the IPv4 routes from the route table main [12b410aa-8751-9fd7-2501-000000000060] 40074 1727204630.87814: sending task result for task 12b410aa-8751-9fd7-2501-000000000060 40074 1727204630.87927: done sending task result for task 12b410aa-8751-9fd7-2501-000000000060 40074 1727204630.87930: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "-4", "route" ], "delta": "0:00:00.004386", "end": "2024-09-24 15:03:50.831315", "rc": 0, "start": "2024-09-24 15:03:50.826929" } STDOUT: default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown 198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4 198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2 198.51.100.0/24 dev ethtest0 proto kernel scope link src 198.51.100.3 metric 103 198.51.100.0/24 dev ethtest1 proto kernel scope link src 198.51.100.6 metric 104 40074 1727204630.88026: no more pending results, returning what we have 40074 1727204630.88030: results queue empty 40074 1727204630.88031: checking for any_errors_fatal 40074 1727204630.88033: done checking for any_errors_fatal 40074 1727204630.88034: checking for max_fail_percentage 40074 1727204630.88036: done checking for max_fail_percentage 40074 1727204630.88037: checking to see if all hosts have failed and the running result is not ok 40074 1727204630.88038: done checking to see if all hosts have 
failed 40074 1727204630.88039: getting the remaining hosts for this loop 40074 1727204630.88041: done getting the remaining hosts for this loop 40074 1727204630.88045: getting the next task for host managed-node2 40074 1727204630.88052: done getting next task for host managed-node2 40074 1727204630.88055: ^ task is: TASK: Assert that the route table main contains the specified IPv4 routes 40074 1727204630.88057: ^ state is: HOST STATE: block=3, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204630.88061: getting variables 40074 1727204630.88062: in VariableManager get_vars() 40074 1727204630.88112: Calling all_inventory to load vars for managed-node2 40074 1727204630.88116: Calling groups_inventory to load vars for managed-node2 40074 1727204630.88121: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204630.88132: Calling all_plugins_play to load vars for managed-node2 40074 1727204630.88135: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204630.88139: Calling groups_plugins_play to load vars for managed-node2 40074 1727204630.89419: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204630.92399: done with get_vars() 40074 1727204630.92436: done getting variables 40074 1727204630.92518: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv4 routes] ***** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:78 Tuesday 24 September 2024 15:03:50 -0400 (0:00:00.445) 0:00:24.687 ***** 40074 1727204630.92552: entering _queue_task() for managed-node2/assert 40074 1727204630.93013: worker is 1 (out of 1 available) 40074 1727204630.93062: exiting _queue_task() for managed-node2/assert 40074 1727204630.93075: done queuing things up, now waiting for results queue to drain 40074 1727204630.93076: waiting for pending results... 40074 1727204630.93912: running TaskExecutor() for managed-node2/TASK: Assert that the route table main contains the specified IPv4 routes 40074 1727204630.94196: in run() - task 12b410aa-8751-9fd7-2501-000000000061 40074 1727204630.94200: variable 'ansible_search_path' from source: unknown 40074 1727204630.94204: calling self._execute() 40074 1727204630.94281: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204630.94295: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204630.94448: variable 'omit' from source: magic vars 40074 1727204630.94952: variable 'ansible_distribution_major_version' from source: facts 40074 1727204630.94966: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204630.94970: variable 'omit' from source: magic vars 40074 1727204630.95001: variable 'omit' from source: magic vars 40074 1727204630.95048: variable 'omit' from source: magic vars 40074 1727204630.95088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204630.95124: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204630.95143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204630.95161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 40074 1727204630.95174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204630.95207: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204630.95211: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204630.95215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204630.95305: Set connection var ansible_pipelining to False 40074 1727204630.95312: Set connection var ansible_shell_executable to /bin/sh 40074 1727204630.95315: Set connection var ansible_shell_type to sh 40074 1727204630.95321: Set connection var ansible_connection to ssh 40074 1727204630.95327: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204630.95334: Set connection var ansible_timeout to 10 40074 1727204630.95358: variable 'ansible_shell_executable' from source: unknown 40074 1727204630.95361: variable 'ansible_connection' from source: unknown 40074 1727204630.95364: variable 'ansible_module_compression' from source: unknown 40074 1727204630.95367: variable 'ansible_shell_type' from source: unknown 40074 1727204630.95371: variable 'ansible_shell_executable' from source: unknown 40074 1727204630.95375: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204630.95380: variable 'ansible_pipelining' from source: unknown 40074 1727204630.95384: variable 'ansible_timeout' from source: unknown 40074 1727204630.95391: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204630.95512: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 
1727204630.95523: variable 'omit' from source: magic vars
40074 1727204630.95532: starting attempt loop
40074 1727204630.95535: running the handler
40074 1727204630.95682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204630.95882: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204630.95922: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204630.95993: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204630.96024: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204630.96102: variable 'route_table_main_ipv4' from source: set_fact
40074 1727204630.96132: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4")): True
40074 1727204630.96256: variable 'route_table_main_ipv4' from source: set_fact
40074 1727204630.96284: Evaluated conditional (route_table_main_ipv4.stdout is search("198.51.12.128/26 via 198.51.100.1 dev ethtest1\s+(proto static )?metric 2")): True
40074 1727204630.96292: handler run complete
40074 1727204630.96307: attempt loop complete, returning result
40074 1727204630.96310: _execute() done
40074 1727204630.96313: dumping result to json
40074 1727204630.96321: done dumping result, returning
40074 1727204630.96326: done running TaskExecutor() for managed-node2/TASK: Assert that the route table main contains the specified IPv4 routes [12b410aa-8751-9fd7-2501-000000000061]
40074 1727204630.96331: sending task result for task 12b410aa-8751-9fd7-2501-000000000061
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed
40074 1727204630.96484: no more pending results, returning what we have
40074 1727204630.96488: results queue empty
40074
1727204630.96491: checking for any_errors_fatal 40074 1727204630.96505: done checking for any_errors_fatal 40074 1727204630.96506: checking for max_fail_percentage 40074 1727204630.96507: done checking for max_fail_percentage 40074 1727204630.96508: checking to see if all hosts have failed and the running result is not ok 40074 1727204630.96510: done checking to see if all hosts have failed 40074 1727204630.96511: getting the remaining hosts for this loop 40074 1727204630.96512: done getting the remaining hosts for this loop 40074 1727204630.96519: getting the next task for host managed-node2 40074 1727204630.96525: done getting next task for host managed-node2 40074 1727204630.96528: ^ task is: TASK: Get the IPv6 routes from the route table main 40074 1727204630.96531: ^ state is: HOST STATE: block=3, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False
40074 1727204630.96534: getting variables
40074 1727204630.96536: in VariableManager get_vars()
40074 1727204630.96580: Calling all_inventory to load vars for managed-node2
40074 1727204630.96584: Calling groups_inventory to load vars for managed-node2
40074 1727204630.96586: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204630.96601: done sending task result for task 12b410aa-8751-9fd7-2501-000000000061
40074 1727204630.96604: WORKER PROCESS EXITING
40074 1727204630.96614: Calling all_plugins_play to load vars for managed-node2
40074 1727204630.96620: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204630.96624: Calling groups_plugins_play to load vars for managed-node2
40074 1727204630.98802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204631.01912: done with get_vars()
40074 1727204631.01960: done getting variables
40074 1727204631.02038: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Get the IPv6 routes from the route table main] ***************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:89
Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.095) 0:00:24.782 *****
40074 1727204631.02077: entering _queue_task() for managed-node2/command
40074 1727204631.02466: worker is 1 (out of 1 available)
40074 1727204631.02479: exiting _queue_task() for managed-node2/command
40074 1727204631.02495: done queuing things up, now waiting for results queue to drain
40074 1727204631.02496: waiting for pending results...
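[Editor's note] The "Evaluated conditional ... search(...)" entries above come from the assert task's Jinja2 `search` test, which applies a Python regular expression to the captured `ip -4 route` stdout. A minimal sketch of the same check in plain Python, using the route lines and patterns taken verbatim from this log (the variable names here are illustrative, not from the playbook):

```python
import re

# Two of the route lines captured by the "Get the IPv4 routes" task above
route_table = (
    "198.51.10.64/26 via 198.51.100.6 dev ethtest0 proto static metric 4\n"
    "198.51.12.128/26 via 198.51.100.1 dev ethtest1 proto static metric 2\n"
)

# Same pattern the assert task evaluates; "proto static " is optional,
# so the check also passes on kernels that omit the proto field here
pattern = r"198.51.10.64/26 via 198.51.100.6 dev ethtest0\s+(proto static )?metric 4"

# Jinja2's `stdout is search(pattern)` is equivalent to re.search on the string
assert re.search(pattern, route_table) is not None
```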
40074 1727204631.02810: running TaskExecutor() for managed-node2/TASK: Get the IPv6 routes from the route table main 40074 1727204631.02933: in run() - task 12b410aa-8751-9fd7-2501-000000000062 40074 1727204631.02954: variable 'ansible_search_path' from source: unknown 40074 1727204631.03000: calling self._execute() 40074 1727204631.03126: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.03146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.03166: variable 'omit' from source: magic vars 40074 1727204631.03639: variable 'ansible_distribution_major_version' from source: facts 40074 1727204631.03658: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204631.03668: variable 'omit' from source: magic vars 40074 1727204631.03797: variable 'omit' from source: magic vars 40074 1727204631.03800: variable 'omit' from source: magic vars 40074 1727204631.03803: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204631.03848: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204631.03881: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204631.03919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204631.03943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204631.03988: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204631.04001: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.04014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.04164: Set connection var ansible_pipelining to False 40074 
1727204631.04178: Set connection var ansible_shell_executable to /bin/sh 40074 1727204631.04186: Set connection var ansible_shell_type to sh 40074 1727204631.04197: Set connection var ansible_connection to ssh 40074 1727204631.04209: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204631.04225: Set connection var ansible_timeout to 10 40074 1727204631.04267: variable 'ansible_shell_executable' from source: unknown 40074 1727204631.04276: variable 'ansible_connection' from source: unknown 40074 1727204631.04284: variable 'ansible_module_compression' from source: unknown 40074 1727204631.04343: variable 'ansible_shell_type' from source: unknown 40074 1727204631.04346: variable 'ansible_shell_executable' from source: unknown 40074 1727204631.04349: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.04351: variable 'ansible_pipelining' from source: unknown 40074 1727204631.04354: variable 'ansible_timeout' from source: unknown 40074 1727204631.04356: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.04522: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204631.04544: variable 'omit' from source: magic vars 40074 1727204631.04560: starting attempt loop 40074 1727204631.04568: running the handler 40074 1727204631.04694: _low_level_execute_command(): starting 40074 1727204631.04698: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204631.05495: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.05549: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204631.05570: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204631.05607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.05682: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.07454: stdout chunk (state=3): >>>/root <<< 40074 1727204631.07648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.07683: stdout chunk (state=3): >>><<< 40074 1727204631.07687: stderr chunk (state=3): >>><<< 40074 1727204631.07714: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204631.07841: _low_level_execute_command(): starting 40074 1727204631.07847: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583 `" && echo ansible-tmp-1727204631.077241-41080-68920583974583="` echo /root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583 `" ) && sleep 0' 40074 1727204631.08582: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204631.08634: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204631.08981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204631.08996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204631.09085: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.09093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.11179: stdout chunk (state=3): >>>ansible-tmp-1727204631.077241-41080-68920583974583=/root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583 <<< 40074 1727204631.11392: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.11396: stdout chunk (state=3): >>><<< 40074 1727204631.11398: stderr chunk (state=3): >>><<< 40074 1727204631.11496: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204631.077241-41080-68920583974583=/root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: 
Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204631.11500: variable 'ansible_module_compression' from source: unknown 40074 1727204631.11539: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204631.11588: variable 'ansible_facts' from source: unknown 40074 1727204631.11710: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/AnsiballZ_command.py 40074 1727204631.11925: Sending initial data 40074 1727204631.11929: Sent initial data (154 bytes) 40074 1727204631.12679: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204631.12708: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.12784: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 40074 1727204631.14485: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204631.14559: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204631.14608: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpz0ujfqs6 /root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/AnsiballZ_command.py <<< 40074 1727204631.14612: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/AnsiballZ_command.py" <<< 40074 1727204631.14645: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpz0ujfqs6" to remote "/root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/AnsiballZ_command.py" <<< 40074 1727204631.21661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.21739: stderr chunk (state=3): >>><<< 40074 1727204631.21743: stdout chunk (state=3): >>><<< 40074 1727204631.21764: done transferring module 
to remote 40074 1727204631.21782: _low_level_execute_command(): starting 40074 1727204631.21785: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/ /root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/AnsiballZ_command.py && sleep 0' 40074 1727204631.22269: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.22273: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.22275: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.22278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204631.22281: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.22335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204631.22341: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.22381: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.24358: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.24412: stderr chunk 
(state=3): >>><<< 40074 1727204631.24414: stdout chunk (state=3): >>><<< 40074 1727204631.24428: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204631.24496: _low_level_execute_command(): starting 40074 1727204631.24500: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/AnsiballZ_command.py && sleep 0' 40074 1727204631.24882: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204631.24899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.24912: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.24960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204631.24977: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.25026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.43061: stdout chunk (state=3): >>> {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 15:03:51.425398", "end": "2024-09-24 15:03:51.429485", "delta": "0:00:00.004087", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} 
<<< 40074 1727204631.44770: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204631.44834: stderr chunk (state=3): >>><<< 40074 1727204631.44838: stdout chunk (state=3): >>><<< 40074 1727204631.44855: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium\n2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium\n2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium\nfe80::/64 dev peerethtest0 proto kernel metric 256 pref medium\nfe80::/64 dev peerethtest1 proto kernel metric 256 pref medium\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest0 proto kernel metric 1024 pref medium\nfe80::/64 dev ethtest1 proto kernel metric 1024 pref medium", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "route"], "start": "2024-09-24 15:03:51.425398", "end": "2024-09-24 15:03:51.429485", "delta": "0:00:00.004087", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 route", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204631.44902: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 route', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204631.44911: _low_level_execute_command(): starting 40074 1727204631.44920: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204631.077241-41080-68920583974583/ > /dev/null 2>&1 && sleep 0' 40074 1727204631.45396: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.45400: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204631.45411: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204631.45427: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.45469: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204631.45488: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.45528: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.47471: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.47527: stderr chunk (state=3): >>><<< 40074 1727204631.47531: stdout chunk (state=3): >>><<< 40074 1727204631.47546: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204631.47556: handler run complete 40074 1727204631.47579: Evaluated conditional (False): False 40074 1727204631.47595: attempt loop complete, returning result 40074 1727204631.47603: _execute() done 40074 1727204631.47606: dumping result to json 40074 1727204631.47611: done dumping result, returning 40074 1727204631.47621: done running TaskExecutor() for managed-node2/TASK: Get the IPv6 routes from the route table main [12b410aa-8751-9fd7-2501-000000000062] 40074 1727204631.47627: sending task result for task 12b410aa-8751-9fd7-2501-000000000062 40074 1727204631.47732: done sending task result for task 12b410aa-8751-9fd7-2501-000000000062 40074 1727204631.47735: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "-6", "route" ], "delta": "0:00:00.004087", "end": "2024-09-24 15:03:51.429485", "rc": 0, "start": "2024-09-24 15:03:51.425398" } STDOUT: 2001:db6::4 via 2001:db8::1 dev ethtest0 proto static metric 2 pref medium 2001:db8::/32 dev ethtest0 proto kernel metric 103 pref medium 2001:db8::/32 dev ethtest1 proto kernel metric 104 pref medium fe80::/64 dev peerethtest0 proto kernel metric 256 pref medium fe80::/64 dev peerethtest1 proto kernel metric 256 pref medium fe80::/64 dev eth0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest0 proto kernel metric 1024 pref medium fe80::/64 dev ethtest1 proto kernel metric 1024 pref medium 40074 1727204631.47827: no more pending results, returning what we have 40074 1727204631.47831: results queue empty 40074 1727204631.47832: checking for 
any_errors_fatal 40074 1727204631.47844: done checking for any_errors_fatal 40074 1727204631.47845: checking for max_fail_percentage 40074 1727204631.47847: done checking for max_fail_percentage 40074 1727204631.47848: checking to see if all hosts have failed and the running result is not ok 40074 1727204631.47849: done checking to see if all hosts have failed 40074 1727204631.47850: getting the remaining hosts for this loop 40074 1727204631.47852: done getting the remaining hosts for this loop 40074 1727204631.47856: getting the next task for host managed-node2 40074 1727204631.47862: done getting next task for host managed-node2 40074 1727204631.47865: ^ task is: TASK: Assert that the route table main contains the specified IPv6 routes 40074 1727204631.47867: ^ state is: HOST STATE: block=3, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204631.47871: getting variables 40074 1727204631.47873: in VariableManager get_vars() 40074 1727204631.47923: Calling all_inventory to load vars for managed-node2 40074 1727204631.47927: Calling groups_inventory to load vars for managed-node2 40074 1727204631.47930: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204631.47942: Calling all_plugins_play to load vars for managed-node2 40074 1727204631.47945: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204631.47948: Calling groups_plugins_play to load vars for managed-node2 40074 1727204631.49372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204631.50991: done with get_vars() 40074 1727204631.51015: done getting variables 40074 1727204631.51069: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the route table main contains the specified IPv6 routes] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:94 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.490) 0:00:25.272 ***** 40074 1727204631.51094: entering _queue_task() for managed-node2/assert 40074 1727204631.51359: worker is 1 (out of 1 available) 40074 1727204631.51376: exiting _queue_task() for managed-node2/assert 40074 1727204631.51393: done queuing things up, now waiting for results queue to drain 40074 1727204631.51395: waiting for pending results... 
40074 1727204631.51592: running TaskExecutor() for managed-node2/TASK: Assert that the route table main contains the specified IPv6 routes 40074 1727204631.51665: in run() - task 12b410aa-8751-9fd7-2501-000000000063 40074 1727204631.51678: variable 'ansible_search_path' from source: unknown 40074 1727204631.51712: calling self._execute() 40074 1727204631.51803: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.51810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.51824: variable 'omit' from source: magic vars 40074 1727204631.52164: variable 'ansible_distribution_major_version' from source: facts 40074 1727204631.52177: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204631.52181: variable 'omit' from source: magic vars 40074 1727204631.52205: variable 'omit' from source: magic vars 40074 1727204631.52238: variable 'omit' from source: magic vars 40074 1727204631.52277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204631.52313: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204631.52334: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204631.52352: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204631.52364: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204631.52396: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204631.52402: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.52404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.52695: Set connection var 
ansible_pipelining to False 40074 1727204631.52699: Set connection var ansible_shell_executable to /bin/sh 40074 1727204631.52702: Set connection var ansible_shell_type to sh 40074 1727204631.52704: Set connection var ansible_connection to ssh 40074 1727204631.52707: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204631.52710: Set connection var ansible_timeout to 10 40074 1727204631.52713: variable 'ansible_shell_executable' from source: unknown 40074 1727204631.52716: variable 'ansible_connection' from source: unknown 40074 1727204631.52721: variable 'ansible_module_compression' from source: unknown 40074 1727204631.52725: variable 'ansible_shell_type' from source: unknown 40074 1727204631.52728: variable 'ansible_shell_executable' from source: unknown 40074 1727204631.52730: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.52733: variable 'ansible_pipelining' from source: unknown 40074 1727204631.52736: variable 'ansible_timeout' from source: unknown 40074 1727204631.52739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.52858: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204631.52880: variable 'omit' from source: magic vars 40074 1727204631.52897: starting attempt loop 40074 1727204631.52906: running the handler 40074 1727204631.53129: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204631.53421: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204631.53508: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 
1727204631.53616: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204631.53668: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204631.53748: variable 'route_table_main_ipv6' from source: set_fact 40074 1727204631.53780: Evaluated conditional (route_table_main_ipv6.stdout is search("2001:db6::4 via 2001:db8::1 dev ethtest0\s+(proto static )?metric 2")): True 40074 1727204631.53787: handler run complete 40074 1727204631.53813: attempt loop complete, returning result 40074 1727204631.53819: _execute() done 40074 1727204631.53823: dumping result to json 40074 1727204631.53825: done dumping result, returning 40074 1727204631.53831: done running TaskExecutor() for managed-node2/TASK: Assert that the route table main contains the specified IPv6 routes [12b410aa-8751-9fd7-2501-000000000063] 40074 1727204631.53837: sending task result for task 12b410aa-8751-9fd7-2501-000000000063 40074 1727204631.53929: done sending task result for task 12b410aa-8751-9fd7-2501-000000000063 40074 1727204631.53932: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 40074 1727204631.53982: no more pending results, returning what we have 40074 1727204631.53986: results queue empty 40074 1727204631.53987: checking for any_errors_fatal 40074 1727204631.53999: done checking for any_errors_fatal 40074 1727204631.54000: checking for max_fail_percentage 40074 1727204631.54002: done checking for max_fail_percentage 40074 1727204631.54003: checking to see if all hosts have failed and the running result is not ok 40074 1727204631.54005: done checking to see if all hosts have failed 40074 1727204631.54006: getting the remaining hosts for this loop 40074 1727204631.54007: done getting the remaining hosts for this loop 40074 1727204631.54011: getting the next task for host managed-node2 40074 1727204631.54021: done getting next task 
for host managed-node2 40074 1727204631.54024: ^ task is: TASK: Get the interface1 MAC address 40074 1727204631.54026: ^ state is: HOST STATE: block=3, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204631.54029: getting variables 40074 1727204631.54031: in VariableManager get_vars() 40074 1727204631.54071: Calling all_inventory to load vars for managed-node2 40074 1727204631.54075: Calling groups_inventory to load vars for managed-node2 40074 1727204631.54077: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204631.54088: Calling all_plugins_play to load vars for managed-node2 40074 1727204631.54099: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204631.54104: Calling groups_plugins_play to load vars for managed-node2 40074 1727204631.55357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204631.58547: done with get_vars() 40074 1727204631.58587: done getting variables 40074 1727204631.58661: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get the interface1 MAC address] ****************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:99 Tuesday 24 September 2024 15:03:51 -0400 (0:00:00.076) 0:00:25.348 ***** 40074 1727204631.58700: entering _queue_task() for managed-node2/command 40074 1727204631.59229: worker is 1 (out of 1 available) 40074 1727204631.59242: exiting 
_queue_task() for managed-node2/command 40074 1727204631.59254: done queuing things up, now waiting for results queue to drain 40074 1727204631.59255: waiting for pending results... 40074 1727204631.59608: running TaskExecutor() for managed-node2/TASK: Get the interface1 MAC address 40074 1727204631.59614: in run() - task 12b410aa-8751-9fd7-2501-000000000064 40074 1727204631.59620: variable 'ansible_search_path' from source: unknown 40074 1727204631.59623: calling self._execute() 40074 1727204631.59733: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.59746: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.59763: variable 'omit' from source: magic vars 40074 1727204631.60260: variable 'ansible_distribution_major_version' from source: facts 40074 1727204631.60279: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204631.60291: variable 'omit' from source: magic vars 40074 1727204631.60325: variable 'omit' from source: magic vars 40074 1727204631.60471: variable 'interface1' from source: play vars 40074 1727204631.60501: variable 'omit' from source: magic vars 40074 1727204631.60552: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204631.60609: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204631.60638: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204631.60664: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204631.60789: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204631.60794: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204631.60797: variable 
'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.60800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.60895: Set connection var ansible_pipelining to False 40074 1727204631.60914: Set connection var ansible_shell_executable to /bin/sh 40074 1727204631.60926: Set connection var ansible_shell_type to sh 40074 1727204631.60933: Set connection var ansible_connection to ssh 40074 1727204631.60946: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204631.60957: Set connection var ansible_timeout to 10 40074 1727204631.61006: variable 'ansible_shell_executable' from source: unknown 40074 1727204631.61010: variable 'ansible_connection' from source: unknown 40074 1727204631.61013: variable 'ansible_module_compression' from source: unknown 40074 1727204631.61015: variable 'ansible_shell_type' from source: unknown 40074 1727204631.61116: variable 'ansible_shell_executable' from source: unknown 40074 1727204631.61123: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204631.61125: variable 'ansible_pipelining' from source: unknown 40074 1727204631.61129: variable 'ansible_timeout' from source: unknown 40074 1727204631.61131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204631.61236: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204631.61336: variable 'omit' from source: magic vars 40074 1727204631.61339: starting attempt loop 40074 1727204631.61342: running the handler 40074 1727204631.61344: _low_level_execute_command(): starting 40074 1727204631.61347: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204631.62099: 
stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204631.62232: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.62238: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.62258: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204631.62277: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204631.62307: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.62404: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.64160: stdout chunk (state=3): >>>/root <<< 40074 1727204631.64325: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.64384: stderr chunk (state=3): >>><<< 40074 1727204631.64412: stdout chunk (state=3): >>><<< 40074 1727204631.64449: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204631.64566: _low_level_execute_command(): starting 40074 1727204631.64570: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374 `" && echo ansible-tmp-1727204631.6445637-41098-131562815369374="` echo /root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374 `" ) && sleep 0' 40074 1727204631.65164: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204631.65178: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.65193: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204631.65212: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204631.65309: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204631.65314: stderr 
chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.65372: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204631.65396: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204631.65415: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.65486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.67593: stdout chunk (state=3): >>>ansible-tmp-1727204631.6445637-41098-131562815369374=/root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374 <<< 40074 1727204631.67753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.67757: stdout chunk (state=3): >>><<< 40074 1727204631.67759: stderr chunk (state=3): >>><<< 40074 1727204631.67776: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204631.6445637-41098-131562815369374=/root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204631.67932: variable 'ansible_module_compression' from source: unknown 40074 1727204631.67937: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204631.67939: variable 'ansible_facts' from source: unknown 40074 1727204631.68043: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/AnsiballZ_command.py 40074 1727204631.68302: Sending initial data 40074 1727204631.68305: Sent initial data (156 bytes) 40074 1727204631.68885: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204631.68900: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.68939: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204631.68953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.69051: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.69078: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204631.69098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204631.69122: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.69199: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.70900: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204631.70960: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204631.71028: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpg8hvt67l /root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/AnsiballZ_command.py <<< 40074 1727204631.71032: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/AnsiballZ_command.py" <<< 40074 1727204631.71066: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpg8hvt67l" to remote "/root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/AnsiballZ_command.py" <<< 40074 1727204631.72227: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.72261: stderr chunk (state=3): >>><<< 40074 1727204631.72275: stdout chunk (state=3): >>><<< 40074 1727204631.72371: done transferring module to remote 40074 1727204631.72376: _low_level_execute_command(): starting 40074 1727204631.72379: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/ /root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/AnsiballZ_command.py && sleep 0' 40074 1727204631.72980: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204631.72998: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.73013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204631.73136: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204631.73156: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.73172: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204631.73192: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204631.73215: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.73291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.75323: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.75334: stdout chunk (state=3): >>><<< 40074 1727204631.75347: stderr chunk (state=3): >>><<< 40074 1727204631.75370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204631.75379: _low_level_execute_command(): starting 40074 1727204631.75391: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/AnsiballZ_command.py && sleep 0' 40074 1727204631.76027: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204631.76050: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204631.76068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204631.76086: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204631.76160: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204631.76214: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 
1727204631.76245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.76327: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.94379: stdout chunk (state=3): >>> {"changed": true, "stdout": "be:be:47:b2:eb:46", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-24 15:03:51.939149", "end": "2024-09-24 15:03:51.942712", "delta": "0:00:00.003563", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204631.96344: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204631.96364: stdout chunk (state=3): >>><<< 40074 1727204631.96414: stderr chunk (state=3): >>><<< 40074 1727204631.96499: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "be:be:47:b2:eb:46", "stderr": "", "rc": 0, "cmd": ["cat", "/sys/class/net/ethtest1/address"], "start": "2024-09-24 15:03:51.939149", "end": "2024-09-24 15:03:51.942712", "delta": "0:00:00.003563", "msg": "", "invocation": {"module_args": {"_raw_params": "cat /sys/class/net/ethtest1/address", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204631.96668: done with _execute_module (ansible.legacy.command, {'_raw_params': 'cat /sys/class/net/ethtest1/address', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204631.96777: _low_level_execute_command(): starting 40074 1727204631.96781: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204631.6445637-41098-131562815369374/ > /dev/null 2>&1 && sleep 0' 40074 1727204631.97379: stderr chunk (state=2): >>>OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204631.97405: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204631.97506: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204631.97526: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204631.97627: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204631.99604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204631.99694: stderr chunk (state=3): >>><<< 40074 1727204631.99697: stdout chunk (state=3): >>><<< 40074 1727204631.99721: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204631.99727: handler run complete 40074 1727204631.99763: Evaluated conditional (False): False 40074 1727204631.99775: attempt loop complete, returning result 40074 1727204631.99785: _execute() done 40074 1727204631.99793: dumping result to json 40074 1727204632.00093: done dumping result, returning 40074 1727204632.00096: done running TaskExecutor() for managed-node2/TASK: Get the interface1 MAC address [12b410aa-8751-9fd7-2501-000000000064] 40074 1727204632.00098: sending task result for task 12b410aa-8751-9fd7-2501-000000000064 40074 1727204632.00170: done sending task result for task 12b410aa-8751-9fd7-2501-000000000064 40074 1727204632.00173: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "cat", "/sys/class/net/ethtest1/address" ], "delta": "0:00:00.003563", "end": "2024-09-24 15:03:51.942712", "rc": 0, "start": "2024-09-24 15:03:51.939149" } STDOUT: be:be:47:b2:eb:46 40074 1727204632.00271: no more pending results, returning what we have 40074 1727204632.00275: results queue empty 40074 1727204632.00276: checking for any_errors_fatal 40074 1727204632.00286: done checking for any_errors_fatal 40074 1727204632.00287: checking for max_fail_percentage 40074 1727204632.00291: done checking for 
max_fail_percentage 40074 1727204632.00292: checking to see if all hosts have failed and the running result is not ok 40074 1727204632.00294: done checking to see if all hosts have failed 40074 1727204632.00295: getting the remaining hosts for this loop 40074 1727204632.00296: done getting the remaining hosts for this loop 40074 1727204632.00301: getting the next task for host managed-node2 40074 1727204632.00309: done getting next task for host managed-node2 40074 1727204632.00319: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 40074 1727204632.00323: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204632.00346: getting variables 40074 1727204632.00348: in VariableManager get_vars() 40074 1727204632.00629: Calling all_inventory to load vars for managed-node2 40074 1727204632.00633: Calling groups_inventory to load vars for managed-node2 40074 1727204632.00636: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204632.00646: Calling all_plugins_play to load vars for managed-node2 40074 1727204632.00650: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204632.00653: Calling groups_plugins_play to load vars for managed-node2 40074 1727204632.03026: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204632.06056: done with get_vars() 40074 1727204632.06102: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.475) 0:00:25.823 ***** 40074 1727204632.06224: entering _queue_task() for managed-node2/include_tasks 40074 1727204632.06604: worker is 1 (out of 1 available) 40074 1727204632.06619: exiting _queue_task() for managed-node2/include_tasks 40074 1727204632.06633: done queuing things up, now waiting for results queue to drain 40074 1727204632.06635: waiting for pending results... 
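The task that just completed ("Get the interface1 MAC address") simply read `/sys/class/net/ethtest1/address` over SSH and returned `be:be:47:b2:eb:46`. Outside Ansible, the same sysfs lookup can be sketched in Python — the interface name and the validation helper below are illustrative additions, not part of the playbook:

```python
import os
import re

# Colon-separated 48-bit MAC, lowercase hex, as sysfs reports it --
# e.g. "be:be:47:b2:eb:46", the value seen in the task output above.
MAC_RE = re.compile(r"^([0-9a-f]{2}:){5}[0-9a-f]{2}$")

def read_mac(iface: str) -> str:
    """Read and validate an interface's MAC address from
    /sys/class/net/<iface>/address (Linux-only)."""
    path = os.path.join("/sys/class/net", iface, "address")
    with open(path) as fh:
        mac = fh.read().strip()
    if not MAC_RE.match(mac):
        raise ValueError(f"unexpected MAC format: {mac!r}")
    return mac

# The format check can be exercised without a live interface:
print(bool(MAC_RE.match("be:be:47:b2:eb:46")))  # True
```

This mirrors only the remote command's effect; the surrounding log shows the real mechanics Ansible uses (sftp-transfer of `AnsiballZ_command.py`, `chmod u+x`, execution with the remote Python, then `rm -f -r` of the temp directory).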
40074 1727204632.07111: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 40074 1727204632.07124: in run() - task 12b410aa-8751-9fd7-2501-00000000006c 40074 1727204632.07150: variable 'ansible_search_path' from source: unknown 40074 1727204632.07161: variable 'ansible_search_path' from source: unknown 40074 1727204632.07211: calling self._execute() 40074 1727204632.07330: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204632.07495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204632.07500: variable 'omit' from source: magic vars 40074 1727204632.07809: variable 'ansible_distribution_major_version' from source: facts 40074 1727204632.07829: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204632.07843: _execute() done 40074 1727204632.07854: dumping result to json 40074 1727204632.07864: done dumping result, returning 40074 1727204632.07896: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-9fd7-2501-00000000006c] 40074 1727204632.07900: sending task result for task 12b410aa-8751-9fd7-2501-00000000006c 40074 1727204632.08159: done sending task result for task 12b410aa-8751-9fd7-2501-00000000006c 40074 1727204632.08163: WORKER PROCESS EXITING 40074 1727204632.08212: no more pending results, returning what we have 40074 1727204632.08217: in VariableManager get_vars() 40074 1727204632.08265: Calling all_inventory to load vars for managed-node2 40074 1727204632.08269: Calling groups_inventory to load vars for managed-node2 40074 1727204632.08272: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204632.08285: Calling all_plugins_play to load vars for managed-node2 40074 1727204632.08292: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204632.08297: Calling 
groups_plugins_play to load vars for managed-node2 40074 1727204632.10751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204632.13741: done with get_vars() 40074 1727204632.13791: variable 'ansible_search_path' from source: unknown 40074 1727204632.13794: variable 'ansible_search_path' from source: unknown 40074 1727204632.13847: we have included files to process 40074 1727204632.13849: generating all_blocks data 40074 1727204632.13852: done generating all_blocks data 40074 1727204632.13859: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 40074 1727204632.13860: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 40074 1727204632.13863: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 40074 1727204632.14617: done processing included file 40074 1727204632.14619: iterating over new_blocks loaded from include file 40074 1727204632.14621: in VariableManager get_vars() 40074 1727204632.14654: done with get_vars() 40074 1727204632.14656: filtering new block on tags 40074 1727204632.14680: done filtering new block on tags 40074 1727204632.14684: in VariableManager get_vars() 40074 1727204632.14718: done with get_vars() 40074 1727204632.14720: filtering new block on tags 40074 1727204632.14749: done filtering new block on tags 40074 1727204632.14753: in VariableManager get_vars() 40074 1727204632.14783: done with get_vars() 40074 1727204632.14785: filtering new block on tags 40074 1727204632.14813: done filtering new block on tags 40074 1727204632.14816: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 40074 1727204632.14823: extending task lists for 
all hosts with included blocks 40074 1727204632.15991: done extending task lists 40074 1727204632.15993: done processing included files 40074 1727204632.15994: results queue empty 40074 1727204632.15995: checking for any_errors_fatal 40074 1727204632.16001: done checking for any_errors_fatal 40074 1727204632.16003: checking for max_fail_percentage 40074 1727204632.16004: done checking for max_fail_percentage 40074 1727204632.16005: checking to see if all hosts have failed and the running result is not ok 40074 1727204632.16007: done checking to see if all hosts have failed 40074 1727204632.16008: getting the remaining hosts for this loop 40074 1727204632.16009: done getting the remaining hosts for this loop 40074 1727204632.16012: getting the next task for host managed-node2 40074 1727204632.16018: done getting next task for host managed-node2 40074 1727204632.16021: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 40074 1727204632.16025: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204632.16037: getting variables 40074 1727204632.16038: in VariableManager get_vars() 40074 1727204632.16060: Calling all_inventory to load vars for managed-node2 40074 1727204632.16063: Calling groups_inventory to load vars for managed-node2 40074 1727204632.16067: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204632.16074: Calling all_plugins_play to load vars for managed-node2 40074 1727204632.16078: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204632.16082: Calling groups_plugins_play to load vars for managed-node2 40074 1727204632.18207: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204632.21315: done with get_vars() 40074 1727204632.21351: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.152) 0:00:25.976 ***** 40074 1727204632.21456: entering _queue_task() for managed-node2/setup 40074 1727204632.21850: worker is 1 (out of 1 available) 40074 1727204632.21864: exiting _queue_task() for managed-node2/setup 40074 1727204632.21878: done queuing things up, now waiting for results queue to drain 40074 1727204632.21881: waiting for pending results... 
40074 1727204632.22311: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 40074 1727204632.22411: in run() - task 12b410aa-8751-9fd7-2501-000000000563 40074 1727204632.22434: variable 'ansible_search_path' from source: unknown 40074 1727204632.22444: variable 'ansible_search_path' from source: unknown 40074 1727204632.22490: calling self._execute() 40074 1727204632.22606: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204632.22627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204632.22644: variable 'omit' from source: magic vars 40074 1727204632.23108: variable 'ansible_distribution_major_version' from source: facts 40074 1727204632.23128: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204632.23439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204632.26054: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204632.26153: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204632.26209: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204632.26258: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204632.26297: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204632.26434: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204632.26444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204632.26481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204632.26543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204632.26567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204632.26638: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204632.26760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204632.26763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204632.26768: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204632.26794: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204632.26997: variable '__network_required_facts' from source: role 
'' defaults 40074 1727204632.27014: variable 'ansible_facts' from source: unknown 40074 1727204632.28226: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 40074 1727204632.28235: when evaluation is False, skipping this task 40074 1727204632.28245: _execute() done 40074 1727204632.28254: dumping result to json 40074 1727204632.28263: done dumping result, returning 40074 1727204632.28281: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-9fd7-2501-000000000563] 40074 1727204632.28297: sending task result for task 12b410aa-8751-9fd7-2501-000000000563 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204632.28453: no more pending results, returning what we have 40074 1727204632.28458: results queue empty 40074 1727204632.28459: checking for any_errors_fatal 40074 1727204632.28461: done checking for any_errors_fatal 40074 1727204632.28462: checking for max_fail_percentage 40074 1727204632.28464: done checking for max_fail_percentage 40074 1727204632.28465: checking to see if all hosts have failed and the running result is not ok 40074 1727204632.28466: done checking to see if all hosts have failed 40074 1727204632.28467: getting the remaining hosts for this loop 40074 1727204632.28469: done getting the remaining hosts for this loop 40074 1727204632.28474: getting the next task for host managed-node2 40074 1727204632.28486: done getting next task for host managed-node2 40074 1727204632.28492: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 40074 1727204632.28497: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204632.28519: getting variables 40074 1727204632.28521: in VariableManager get_vars() 40074 1727204632.28570: Calling all_inventory to load vars for managed-node2 40074 1727204632.28574: Calling groups_inventory to load vars for managed-node2 40074 1727204632.28577: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204632.28894: Calling all_plugins_play to load vars for managed-node2 40074 1727204632.28900: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204632.28906: Calling groups_plugins_play to load vars for managed-node2 40074 1727204632.29606: done sending task result for task 12b410aa-8751-9fd7-2501-000000000563 40074 1727204632.29610: WORKER PROCESS EXITING 40074 1727204632.31197: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204632.34426: done with get_vars() 40074 1727204632.34461: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.131) 0:00:26.107 ***** 40074 1727204632.34599: entering _queue_task() for managed-node2/stat 40074 1727204632.35022: worker is 
1 (out of 1 available) 40074 1727204632.35037: exiting _queue_task() for managed-node2/stat 40074 1727204632.35050: done queuing things up, now waiting for results queue to drain 40074 1727204632.35052: waiting for pending results... 40074 1727204632.35344: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 40074 1727204632.35556: in run() - task 12b410aa-8751-9fd7-2501-000000000565 40074 1727204632.35581: variable 'ansible_search_path' from source: unknown 40074 1727204632.35593: variable 'ansible_search_path' from source: unknown 40074 1727204632.35648: calling self._execute() 40074 1727204632.35771: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204632.35785: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204632.35806: variable 'omit' from source: magic vars 40074 1727204632.36308: variable 'ansible_distribution_major_version' from source: facts 40074 1727204632.36330: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204632.36561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204632.36904: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204632.36974: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204632.37025: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204632.37080: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204632.37197: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204632.37236: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204632.37285: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204632.37328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204632.37444: variable '__network_is_ostree' from source: set_fact 40074 1727204632.37458: Evaluated conditional (not __network_is_ostree is defined): False 40074 1727204632.37473: when evaluation is False, skipping this task 40074 1727204632.37487: _execute() done 40074 1727204632.37500: dumping result to json 40074 1727204632.37510: done dumping result, returning 40074 1727204632.37524: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-9fd7-2501-000000000565] 40074 1727204632.37536: sending task result for task 12b410aa-8751-9fd7-2501-000000000565 skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 40074 1727204632.37739: no more pending results, returning what we have 40074 1727204632.37744: results queue empty 40074 1727204632.37745: checking for any_errors_fatal 40074 1727204632.37756: done checking for any_errors_fatal 40074 1727204632.37757: checking for max_fail_percentage 40074 1727204632.37759: done checking for max_fail_percentage 40074 1727204632.37760: checking to see if all hosts have failed and the running result is not ok 40074 1727204632.37761: done checking to see if all hosts have failed 40074 1727204632.37762: getting the remaining hosts for this loop 40074 
1727204632.37764: done getting the remaining hosts for this loop 40074 1727204632.37769: getting the next task for host managed-node2 40074 1727204632.37778: done getting next task for host managed-node2 40074 1727204632.37783: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 40074 1727204632.37788: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204632.37813: getting variables 40074 1727204632.37815: in VariableManager get_vars() 40074 1727204632.37866: Calling all_inventory to load vars for managed-node2 40074 1727204632.37870: Calling groups_inventory to load vars for managed-node2 40074 1727204632.37873: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204632.37886: Calling all_plugins_play to load vars for managed-node2 40074 1727204632.38107: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204632.38115: done sending task result for task 12b410aa-8751-9fd7-2501-000000000565 40074 1727204632.38121: WORKER PROCESS EXITING 40074 1727204632.38127: Calling groups_plugins_play to load vars for managed-node2 40074 1727204632.40751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204632.43931: done with get_vars() 40074 1727204632.43968: done getting variables 40074 1727204632.44046: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.094) 0:00:26.202 ***** 40074 1727204632.44088: entering _queue_task() for managed-node2/set_fact 40074 1727204632.44619: worker is 1 (out of 1 available) 40074 1727204632.44632: exiting _queue_task() for managed-node2/set_fact 40074 1727204632.44645: done queuing things up, now waiting for results queue to drain 40074 1727204632.44647: waiting for pending results... 
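The skip decisions in this log hinge on the role guarding its tasks with `when: not __network_is_ostree is defined`: once the fact has been set, the conditional evaluates False and the task is skipped, as shown above for both "Check if system is ostree" and "Set flag to indicate system is ostree". A minimal Python sketch of that defined-check pattern (the helper name is illustrative, not Ansible's internal API):

```python
def should_run(task_vars: dict, fact_name: str) -> bool:
    """Mirror the `not <fact> is defined` guard seen in the log.

    Returns True only while the fact is still missing, so the
    task that sets it runs at most once per host.
    """
    return fact_name not in task_vars

# Before any set_fact has run, the guard passes and the task would run.
print(should_run({}, "__network_is_ostree"))                        # True

# After the fact exists (as on managed-node2 here), the guard is
# False and Ansible reports "Conditional result was False".
print(should_run({"__network_is_ostree": False}, "__network_is_ostree"))  # False
```

Note that the guard tests only whether the variable is defined, not its truthiness, which is why a fact already set to `False` still causes the skip.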
40074 1727204632.44825: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 40074 1727204632.45048: in run() - task 12b410aa-8751-9fd7-2501-000000000566 40074 1727204632.45073: variable 'ansible_search_path' from source: unknown 40074 1727204632.45081: variable 'ansible_search_path' from source: unknown 40074 1727204632.45135: calling self._execute() 40074 1727204632.45259: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204632.45279: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204632.45412: variable 'omit' from source: magic vars 40074 1727204632.45792: variable 'ansible_distribution_major_version' from source: facts 40074 1727204632.45813: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204632.46059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204632.46413: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204632.46473: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204632.46531: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204632.46575: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204632.46683: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204632.46734: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204632.46772: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204632.46810: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204632.46946: variable '__network_is_ostree' from source: set_fact 40074 1727204632.46950: Evaluated conditional (not __network_is_ostree is defined): False 40074 1727204632.46955: when evaluation is False, skipping this task 40074 1727204632.46963: _execute() done 40074 1727204632.47057: dumping result to json 40074 1727204632.47060: done dumping result, returning 40074 1727204632.47064: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-9fd7-2501-000000000566] 40074 1727204632.47066: sending task result for task 12b410aa-8751-9fd7-2501-000000000566 40074 1727204632.47144: done sending task result for task 12b410aa-8751-9fd7-2501-000000000566 40074 1727204632.47148: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 40074 1727204632.47214: no more pending results, returning what we have 40074 1727204632.47221: results queue empty 40074 1727204632.47222: checking for any_errors_fatal 40074 1727204632.47232: done checking for any_errors_fatal 40074 1727204632.47233: checking for max_fail_percentage 40074 1727204632.47234: done checking for max_fail_percentage 40074 1727204632.47236: checking to see if all hosts have failed and the running result is not ok 40074 1727204632.47237: done checking to see if all hosts have failed 40074 1727204632.47238: getting the remaining hosts for this loop 40074 1727204632.47240: done getting the remaining hosts for this loop 
40074 1727204632.47245: getting the next task for host managed-node2 40074 1727204632.47257: done getting next task for host managed-node2 40074 1727204632.47262: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 40074 1727204632.47267: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204632.47292: getting variables 40074 1727204632.47294: in VariableManager get_vars() 40074 1727204632.47349: Calling all_inventory to load vars for managed-node2 40074 1727204632.47353: Calling groups_inventory to load vars for managed-node2 40074 1727204632.47356: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204632.47370: Calling all_plugins_play to load vars for managed-node2 40074 1727204632.47374: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204632.47378: Calling groups_plugins_play to load vars for managed-node2 40074 1727204632.50080: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204632.53484: done with get_vars() 40074 1727204632.53526: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:03:52 -0400 (0:00:00.095) 0:00:26.298 ***** 40074 1727204632.53654: entering _queue_task() for managed-node2/service_facts 40074 1727204632.54156: worker is 1 (out of 1 available) 40074 1727204632.54171: exiting _queue_task() for managed-node2/service_facts 40074 1727204632.54184: done queuing things up, now waiting for results queue to drain 40074 1727204632.54185: waiting for pending results... 
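The `service_facts` task queued here returns the large `{"ansible_facts": {"services": {...}}}` payload captured later in this log, mapping each unit to its `state`, `status`, and `source`. A hedged sketch of post-processing such a payload, e.g. to extract the running services (the helper is illustrative and not part of the role; the sample is trimmed to the shape of the module output in this run):

```python
def running_services(service_facts: dict) -> list:
    """Return sorted names of services reported as running by service_facts."""
    services = service_facts.get("ansible_facts", {}).get("services", {})
    return sorted(
        name for name, info in services.items()
        if info.get("state") == "running"
    )

# Trimmed sample shaped like the module's JSON result in this log.
sample = {
    "ansible_facts": {
        "services": {
            "NetworkManager.service": {
                "name": "NetworkManager.service",
                "state": "running", "status": "enabled", "source": "systemd",
            },
            "nfs-server.service": {
                "name": "nfs-server.service",
                "state": "stopped", "status": "disabled", "source": "systemd",
            },
        }
    }
}
print(running_services(sample))  # ['NetworkManager.service']
```

The network role uses this kind of check to decide, for example, whether NetworkManager is the active provider before applying connection profiles.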
40074 1727204632.54459: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 40074 1727204632.54696: in run() - task 12b410aa-8751-9fd7-2501-000000000568 40074 1727204632.54700: variable 'ansible_search_path' from source: unknown 40074 1727204632.54703: variable 'ansible_search_path' from source: unknown 40074 1727204632.54738: calling self._execute() 40074 1727204632.54859: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204632.54989: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204632.54995: variable 'omit' from source: magic vars 40074 1727204632.55375: variable 'ansible_distribution_major_version' from source: facts 40074 1727204632.55398: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204632.55410: variable 'omit' from source: magic vars 40074 1727204632.55523: variable 'omit' from source: magic vars 40074 1727204632.55584: variable 'omit' from source: magic vars 40074 1727204632.55644: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204632.55696: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204632.55727: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204632.55763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204632.55783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204632.55862: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204632.55868: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204632.55873: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 40074 1727204632.56002: Set connection var ansible_pipelining to False 40074 1727204632.56015: Set connection var ansible_shell_executable to /bin/sh 40074 1727204632.56027: Set connection var ansible_shell_type to sh 40074 1727204632.56080: Set connection var ansible_connection to ssh 40074 1727204632.56083: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204632.56087: Set connection var ansible_timeout to 10 40074 1727204632.56094: variable 'ansible_shell_executable' from source: unknown 40074 1727204632.56104: variable 'ansible_connection' from source: unknown 40074 1727204632.56111: variable 'ansible_module_compression' from source: unknown 40074 1727204632.56120: variable 'ansible_shell_type' from source: unknown 40074 1727204632.56128: variable 'ansible_shell_executable' from source: unknown 40074 1727204632.56134: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204632.56142: variable 'ansible_pipelining' from source: unknown 40074 1727204632.56148: variable 'ansible_timeout' from source: unknown 40074 1727204632.56156: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204632.56407: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204632.56494: variable 'omit' from source: magic vars 40074 1727204632.56498: starting attempt loop 40074 1727204632.56500: running the handler 40074 1727204632.56502: _low_level_execute_command(): starting 40074 1727204632.56505: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204632.57280: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204632.57346: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204632.57361: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204632.57451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204632.57470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204632.57495: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204632.57580: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204632.59352: stdout chunk (state=3): >>>/root <<< 40074 1727204632.59561: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204632.59564: stdout chunk (state=3): >>><<< 40074 1727204632.59567: stderr chunk (state=3): >>><<< 40074 1727204632.59588: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204632.59698: _low_level_execute_command(): starting 40074 1727204632.59702: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288 `" && echo ansible-tmp-1727204632.595964-41181-270133093081288="` echo /root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288 `" ) && sleep 0' 40074 1727204632.60276: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204632.60294: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204632.60309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204632.60345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204632.60458: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204632.60488: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204632.60563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204632.62679: stdout chunk (state=3): >>>ansible-tmp-1727204632.595964-41181-270133093081288=/root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288 <<< 40074 1727204632.62897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204632.62901: stdout chunk (state=3): >>><<< 40074 1727204632.62903: stderr chunk (state=3): >>><<< 40074 1727204632.63098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204632.595964-41181-270133093081288=/root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204632.63102: variable 'ansible_module_compression' from source: unknown 40074 1727204632.63106: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 40074 1727204632.63108: variable 'ansible_facts' from source: unknown 40074 1727204632.63186: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/AnsiballZ_service_facts.py 40074 1727204632.63352: Sending initial data 40074 1727204632.63435: Sent initial data (161 bytes) 40074 1727204632.64078: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204632.64114: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204632.64120: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204632.64211: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204632.64256: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204632.64273: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204632.64297: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204632.64378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204632.66078: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204632.66142: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204632.66186: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp4ddun5iv /root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/AnsiballZ_service_facts.py <<< 40074 1727204632.66240: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/AnsiballZ_service_facts.py" <<< 40074 1727204632.66250: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp4ddun5iv" to remote "/root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/AnsiballZ_service_facts.py" <<< 40074 1727204632.67661: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204632.67665: stdout chunk (state=3): >>><<< 40074 1727204632.67667: stderr chunk (state=3): >>><<< 40074 1727204632.67670: done transferring module to remote 40074 1727204632.67672: _low_level_execute_command(): starting 40074 1727204632.67674: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/ /root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/AnsiballZ_service_facts.py && sleep 0' 40074 1727204632.68263: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204632.68279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204632.68298: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204632.68321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204632.68349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 <<< 40074 1727204632.68457: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204632.68475: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204632.68493: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204632.68515: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204632.68586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204632.70592: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204632.70613: stdout chunk (state=3): >>><<< 40074 1727204632.70630: stderr chunk (state=3): >>><<< 40074 1727204632.70650: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204632.70659: _low_level_execute_command(): starting 40074 1727204632.70670: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/AnsiballZ_service_facts.py && sleep 0' 40074 1727204632.71331: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204632.71355: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204632.71373: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204632.71394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204632.71474: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204632.71528: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204632.71545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204632.71575: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204632.71655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204635.75358: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name"<<< 40074 1727204635.75373: stdout chunk (state=3): >>>: "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", 
"source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": 
"running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": 
"bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", 
"state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", 
"state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": 
"indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": 
{"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": 
"systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 40074 1727204635.77071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204635.77075: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204635.77099: stderr chunk (state=3): >>><<< 40074 1727204635.77119: stdout chunk (state=3): >>><<< 40074 1727204635.77142: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": 
"disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": 
{"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", 
"source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", 
"status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": 
"mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, 
"systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
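The `service_facts` result logged above maps each systemd unit name to a dict with `name`, `state`, `status`, and `source` keys. A minimal sketch of consuming that shape — the `services` sample and the `running_services` helper below are illustrative, not part of the module's API:

```python
# Small sample shaped like the service_facts payload in the log above:
# each key is a unit name, each value carries name/state/status/source.
services = {
    "NetworkManager.service": {"name": "NetworkManager.service",
                               "state": "running", "status": "enabled",
                               "source": "systemd"},
    "sshd.service": {"name": "sshd.service",
                     "state": "running", "status": "enabled",
                     "source": "systemd"},
    "firewalld.service": {"name": "firewalld.service",
                          "state": "inactive", "status": "disabled",
                          "source": "systemd"},
}

def running_services(facts: dict) -> list[str]:
    """Return the unit names whose reported state is 'running'."""
    return sorted(name for name, svc in facts.items()
                  if svc.get("state") == "running")

print(running_services(services))  # ['NetworkManager.service', 'sshd.service']
```

In a playbook this dict is what the role inspects (via the gathered facts) to decide, for example, whether `NetworkManager.service` is running before applying network state.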
40074 1727204635.78487: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204635.78494: _low_level_execute_command(): starting 40074 1727204635.78497: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204632.595964-41181-270133093081288/ > /dev/null 2>&1 && sleep 0' 40074 1727204635.79139: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204635.79161: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204635.79260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204635.79297: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204635.79317: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204635.79342: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204635.79429: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204635.81498: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204635.81502: stdout chunk (state=3): >>><<< 40074 1727204635.81698: stderr chunk (state=3): >>><<< 40074 1727204635.81702: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204635.81705: handler run complete 40074 1727204635.81958: variable 'ansible_facts' from source: 
unknown 40074 1727204635.82226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204635.83278: variable 'ansible_facts' from source: unknown 40074 1727204635.83486: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204635.83947: attempt loop complete, returning result 40074 1727204635.83955: _execute() done 40074 1727204635.83958: dumping result to json 40074 1727204635.84095: done dumping result, returning 40074 1727204635.84136: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-9fd7-2501-000000000568] 40074 1727204635.84142: sending task result for task 12b410aa-8751-9fd7-2501-000000000568 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204635.85681: no more pending results, returning what we have 40074 1727204635.85685: results queue empty 40074 1727204635.85686: checking for any_errors_fatal 40074 1727204635.85694: done checking for any_errors_fatal 40074 1727204635.85695: checking for max_fail_percentage 40074 1727204635.85697: done checking for max_fail_percentage 40074 1727204635.85698: checking to see if all hosts have failed and the running result is not ok 40074 1727204635.85699: done checking to see if all hosts have failed 40074 1727204635.85700: getting the remaining hosts for this loop 40074 1727204635.85702: done getting the remaining hosts for this loop 40074 1727204635.85706: getting the next task for host managed-node2 40074 1727204635.85713: done getting next task for host managed-node2 40074 1727204635.85718: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 40074 1727204635.85722: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204635.85735: getting variables 40074 1727204635.85737: in VariableManager get_vars() 40074 1727204635.85777: Calling all_inventory to load vars for managed-node2 40074 1727204635.85780: Calling groups_inventory to load vars for managed-node2 40074 1727204635.85783: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204635.85855: Calling all_plugins_play to load vars for managed-node2 40074 1727204635.85860: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204635.85867: done sending task result for task 12b410aa-8751-9fd7-2501-000000000568 40074 1727204635.85870: WORKER PROCESS EXITING 40074 1727204635.85875: Calling groups_plugins_play to load vars for managed-node2 40074 1727204635.88403: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204635.91370: done with get_vars() 40074 1727204635.91424: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:03:55 -0400 (0:00:03.378) 0:00:29.677 ***** 40074 
1727204635.91552: entering _queue_task() for managed-node2/package_facts 40074 1727204635.91940: worker is 1 (out of 1 available) 40074 1727204635.91955: exiting _queue_task() for managed-node2/package_facts 40074 1727204635.91968: done queuing things up, now waiting for results queue to drain 40074 1727204635.91969: waiting for pending results... 40074 1727204635.92302: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 40074 1727204635.92519: in run() - task 12b410aa-8751-9fd7-2501-000000000569 40074 1727204635.92548: variable 'ansible_search_path' from source: unknown 40074 1727204635.92595: variable 'ansible_search_path' from source: unknown 40074 1727204635.92606: calling self._execute() 40074 1727204635.92721: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204635.92736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204635.92759: variable 'omit' from source: magic vars 40074 1727204635.93232: variable 'ansible_distribution_major_version' from source: facts 40074 1727204635.93297: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204635.93301: variable 'omit' from source: magic vars 40074 1727204635.93363: variable 'omit' from source: magic vars 40074 1727204635.93410: variable 'omit' from source: magic vars 40074 1727204635.93464: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204635.93528: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204635.93548: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204635.93794: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204635.93798: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204635.93801: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204635.93803: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204635.93805: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204635.93808: Set connection var ansible_pipelining to False 40074 1727204635.93810: Set connection var ansible_shell_executable to /bin/sh 40074 1727204635.93812: Set connection var ansible_shell_type to sh 40074 1727204635.93815: Set connection var ansible_connection to ssh 40074 1727204635.93817: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204635.93819: Set connection var ansible_timeout to 10 40074 1727204635.93852: variable 'ansible_shell_executable' from source: unknown 40074 1727204635.93861: variable 'ansible_connection' from source: unknown 40074 1727204635.93869: variable 'ansible_module_compression' from source: unknown 40074 1727204635.93877: variable 'ansible_shell_type' from source: unknown 40074 1727204635.93887: variable 'ansible_shell_executable' from source: unknown 40074 1727204635.93897: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204635.93906: variable 'ansible_pipelining' from source: unknown 40074 1727204635.93913: variable 'ansible_timeout' from source: unknown 40074 1727204635.93922: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204635.94167: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204635.94186: variable 'omit' from source: magic vars 40074 1727204635.94200: starting attempt loop 40074 1727204635.94208: running 
the handler 40074 1727204635.94229: _low_level_execute_command(): starting 40074 1727204635.94241: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204635.95001: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204635.95024: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204635.95110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204635.95162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204635.95183: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204635.95210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204635.95283: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204635.97102: stdout chunk (state=3): >>>/root <<< 40074 1727204635.97283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204635.97434: stdout chunk (state=3): >>><<< 40074 1727204635.97438: stderr chunk (state=3): >>><<< 40074 1727204635.97442: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204635.97444: _low_level_execute_command(): starting 40074 1727204635.97446: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282 `" && echo ansible-tmp-1727204635.97324-41346-63302699079282="` echo /root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282 `" ) && sleep 0' 40074 1727204635.98114: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204635.98118: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found <<< 40074 1727204635.98122: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204635.98131: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204635.98133: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204635.98210: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204635.98214: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204635.98256: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204636.00365: stdout chunk (state=3): >>>ansible-tmp-1727204635.97324-41346-63302699079282=/root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282 <<< 40074 1727204636.00729: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204636.00732: stdout chunk (state=3): >>><<< 40074 1727204636.00735: stderr chunk (state=3): >>><<< 40074 1727204636.00738: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204635.97324-41346-63302699079282=/root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204636.00740: variable 'ansible_module_compression' from source: unknown 40074 1727204636.00820: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 40074 1727204636.01020: variable 'ansible_facts' from source: unknown 40074 1727204636.01479: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/AnsiballZ_package_facts.py 40074 1727204636.01884: Sending initial data 40074 1727204636.01894: Sent initial data (159 bytes) 40074 1727204636.02330: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204636.02362: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204636.02426: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204636.02429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204636.02469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204636.04209: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 40074 1727204636.04216: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204636.04244: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204636.04277: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp4s1gpxdr /root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/AnsiballZ_package_facts.py <<< 40074 1727204636.04284: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/AnsiballZ_package_facts.py" <<< 40074 1727204636.04316: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 40074 1727204636.04324: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp4s1gpxdr" to remote "/root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/AnsiballZ_package_facts.py" <<< 40074 1727204636.06915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204636.06982: stderr chunk (state=3): >>><<< 40074 1727204636.06985: stdout chunk (state=3): >>><<< 40074 1727204636.07010: done transferring module to remote 40074 1727204636.07023: _low_level_execute_command(): starting 40074 1727204636.07029: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/ /root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/AnsiballZ_package_facts.py && sleep 0' 40074 1727204636.07468: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204636.07472: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 40074 1727204636.07474: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204636.07477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204636.07536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204636.07544: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204636.07579: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204636.09950: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204636.09954: stderr chunk (state=3): >>><<< 40074 1727204636.09957: stdout chunk (state=3): >>><<< 40074 1727204636.10100: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204636.10104: _low_level_execute_command(): starting 40074 1727204636.10107: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/AnsiballZ_package_facts.py && sleep 0' 40074 1727204636.11210: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204636.11506: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204636.11518: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204636.11535: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204636.11816: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 2 <<< 40074 1727204636.76776: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": 
"6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": 
"1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 40074 1727204636.76801: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", 
"version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", 
"version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 40074 1727204636.76830: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": 
[{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": 
[{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", 
"release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 40074 1727204636.76873: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": 
"dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 40074 1727204636.76893: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 40074 1727204636.76905: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": n<<< 40074 1727204636.76922: stdout chunk (state=3): >>>ull, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": 
"2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5<<< 40074 1727204636.76968: stdout chunk (state=3): >>>", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", 
"release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": 
"1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", 
"epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": 
"perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 40074 1727204636.76976: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": 
"0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": 
"systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 40074 1727204636.76987: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": 
[{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 40074 1727204636.77011: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": 
[{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 40074 1727204636.77030: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": 
"16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": 
[{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 40074 1727204636.79076: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204636.79147: stderr chunk (state=3): >>><<< 40074 1727204636.79151: stdout chunk (state=3): >>><<< 40074 1727204636.79194: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": 
"google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": 
"linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": 
"5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": 
"libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", 
"release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": 
"groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": 
"1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": 
[{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": 
"4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": 
"python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": 
"5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": 
"1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": 
"elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", 
"release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", 
"epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": 
"wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": 
"3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": 
"python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", 
"version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204636.85747: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204636.85762: _low_level_execute_command(): starting 40074 1727204636.85767: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204635.97324-41346-63302699079282/ > /dev/null 2>&1 && sleep 0' 40074 1727204636.86279: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204636.86283: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204636.86286: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204636.86290: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204636.86293: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204636.86347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204636.86352: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204636.86410: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204636.88398: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204636.88454: stderr chunk (state=3): >>><<< 40074 1727204636.88457: stdout chunk (state=3): >>><<< 40074 1727204636.88473: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204636.88481: handler run complete 40074 1727204636.89306: variable 'ansible_facts' from source: unknown 40074 1727204636.89739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204636.91718: variable 'ansible_facts' from source: unknown 40074 1727204636.92198: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204636.92986: attempt loop complete, returning result 40074 1727204636.93010: _execute() done 40074 1727204636.93013: dumping result to json 40074 1727204636.93193: done dumping result, returning 40074 1727204636.93202: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-9fd7-2501-000000000569] 40074 1727204636.93205: sending task result for task 12b410aa-8751-9fd7-2501-000000000569 40074 1727204636.98709: done sending task result for task 12b410aa-8751-9fd7-2501-000000000569 40074 1727204636.98713: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204636.98767: no more pending results, returning what we have 40074 1727204636.98769: results queue empty 40074 1727204636.98770: checking for any_errors_fatal 40074 1727204636.98773: done checking for any_errors_fatal 40074 1727204636.98774: checking for max_fail_percentage 40074 1727204636.98775: done checking for max_fail_percentage 40074 1727204636.98775: checking to see if all hosts have failed and the running result is not ok 40074 1727204636.98776: done checking to see if all hosts 
have failed 40074 1727204636.98776: getting the remaining hosts for this loop 40074 1727204636.98777: done getting the remaining hosts for this loop 40074 1727204636.98780: getting the next task for host managed-node2 40074 1727204636.98783: done getting next task for host managed-node2 40074 1727204636.98786: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 40074 1727204636.98787: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204636.98796: getting variables 40074 1727204636.98797: in VariableManager get_vars() 40074 1727204636.98815: Calling all_inventory to load vars for managed-node2 40074 1727204636.98817: Calling groups_inventory to load vars for managed-node2 40074 1727204636.98820: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204636.98824: Calling all_plugins_play to load vars for managed-node2 40074 1727204636.98826: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204636.98830: Calling groups_plugins_play to load vars for managed-node2 40074 1727204636.99962: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.01561: done with get_vars() 40074 1727204637.01582: done getting variables 40074 1727204637.01626: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:03:57 -0400 (0:00:01.100) 0:00:30.778 ***** 40074 1727204637.01650: entering _queue_task() for managed-node2/debug 40074 1727204637.01939: worker is 1 (out of 1 available) 40074 1727204637.01955: exiting _queue_task() for managed-node2/debug 40074 1727204637.01969: done queuing things up, now waiting for results queue to drain 40074 1727204637.01970: waiting for pending results... 40074 1727204637.02171: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 40074 1727204637.02283: in run() - task 12b410aa-8751-9fd7-2501-00000000006d 40074 1727204637.02299: variable 'ansible_search_path' from source: unknown 40074 1727204637.02304: variable 'ansible_search_path' from source: unknown 40074 1727204637.02340: calling self._execute() 40074 1727204637.02434: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.02441: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.02451: variable 'omit' from source: magic vars 40074 1727204637.02787: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.02800: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.02806: variable 'omit' from source: magic vars 40074 1727204637.02860: variable 'omit' from source: magic vars 40074 1727204637.02947: variable 'network_provider' from source: set_fact 40074 1727204637.02967: variable 'omit' from source: magic vars 40074 1727204637.03008: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 
1727204637.03040: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204637.03057: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204637.03074: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204637.03094: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204637.03126: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204637.03130: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.03133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.03231: Set connection var ansible_pipelining to False 40074 1727204637.03237: Set connection var ansible_shell_executable to /bin/sh 40074 1727204637.03240: Set connection var ansible_shell_type to sh 40074 1727204637.03244: Set connection var ansible_connection to ssh 40074 1727204637.03251: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204637.03257: Set connection var ansible_timeout to 10 40074 1727204637.03279: variable 'ansible_shell_executable' from source: unknown 40074 1727204637.03283: variable 'ansible_connection' from source: unknown 40074 1727204637.03286: variable 'ansible_module_compression' from source: unknown 40074 1727204637.03291: variable 'ansible_shell_type' from source: unknown 40074 1727204637.03294: variable 'ansible_shell_executable' from source: unknown 40074 1727204637.03296: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.03306: variable 'ansible_pipelining' from source: unknown 40074 1727204637.03311: variable 'ansible_timeout' from source: unknown 40074 1727204637.03314: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed-node2' 40074 1727204637.03438: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204637.03449: variable 'omit' from source: magic vars 40074 1727204637.03455: starting attempt loop 40074 1727204637.03458: running the handler 40074 1727204637.03499: handler run complete 40074 1727204637.03511: attempt loop complete, returning result 40074 1727204637.03515: _execute() done 40074 1727204637.03519: dumping result to json 40074 1727204637.03529: done dumping result, returning 40074 1727204637.03541: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-9fd7-2501-00000000006d] 40074 1727204637.03544: sending task result for task 12b410aa-8751-9fd7-2501-00000000006d 40074 1727204637.03635: done sending task result for task 12b410aa-8751-9fd7-2501-00000000006d 40074 1727204637.03638: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 40074 1727204637.03709: no more pending results, returning what we have 40074 1727204637.03712: results queue empty 40074 1727204637.03713: checking for any_errors_fatal 40074 1727204637.03727: done checking for any_errors_fatal 40074 1727204637.03728: checking for max_fail_percentage 40074 1727204637.03729: done checking for max_fail_percentage 40074 1727204637.03730: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.03732: done checking to see if all hosts have failed 40074 1727204637.03733: getting the remaining hosts for this loop 40074 1727204637.03734: done getting the remaining hosts for this loop 40074 1727204637.03739: getting the next task for host managed-node2 40074 1727204637.03744: done getting 
next task for host managed-node2 40074 1727204637.03750: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 40074 1727204637.03753: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204637.03765: getting variables 40074 1727204637.03766: in VariableManager get_vars() 40074 1727204637.03815: Calling all_inventory to load vars for managed-node2 40074 1727204637.03818: Calling groups_inventory to load vars for managed-node2 40074 1727204637.03821: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.03830: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.03833: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.03836: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.05083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.06803: done with get_vars() 40074 1727204637.06827: done getting variables 40074 1727204637.06877: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.052) 0:00:30.830 ***** 40074 1727204637.06907: entering _queue_task() for managed-node2/fail 40074 1727204637.07171: worker is 1 (out of 1 available) 40074 1727204637.07186: exiting _queue_task() for managed-node2/fail 40074 1727204637.07201: done queuing things up, now waiting for results queue to drain 40074 1727204637.07202: waiting for pending results... 40074 1727204637.07394: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 40074 1727204637.07504: in run() - task 12b410aa-8751-9fd7-2501-00000000006e 40074 1727204637.07519: variable 'ansible_search_path' from source: unknown 40074 1727204637.07524: variable 'ansible_search_path' from source: unknown 40074 1727204637.07560: calling self._execute() 40074 1727204637.07645: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.07649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.07666: variable 'omit' from source: magic vars 40074 1727204637.08005: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.08016: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.08124: variable 'network_state' from source: role '' defaults 40074 1727204637.08133: Evaluated conditional (network_state != {}): False 40074 1727204637.08136: when evaluation is False, skipping this task 40074 1727204637.08140: _execute() done 40074 1727204637.08145: dumping result to json 40074 1727204637.08148: done dumping result, returning 40074 
1727204637.08156: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-9fd7-2501-00000000006e] 40074 1727204637.08160: sending task result for task 12b410aa-8751-9fd7-2501-00000000006e 40074 1727204637.08258: done sending task result for task 12b410aa-8751-9fd7-2501-00000000006e 40074 1727204637.08262: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204637.08324: no more pending results, returning what we have 40074 1727204637.08328: results queue empty 40074 1727204637.08329: checking for any_errors_fatal 40074 1727204637.08338: done checking for any_errors_fatal 40074 1727204637.08339: checking for max_fail_percentage 40074 1727204637.08341: done checking for max_fail_percentage 40074 1727204637.08342: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.08344: done checking to see if all hosts have failed 40074 1727204637.08345: getting the remaining hosts for this loop 40074 1727204637.08346: done getting the remaining hosts for this loop 40074 1727204637.08351: getting the next task for host managed-node2 40074 1727204637.08358: done getting next task for host managed-node2 40074 1727204637.08362: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 40074 1727204637.08365: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204637.08393: getting variables 40074 1727204637.08395: in VariableManager get_vars() 40074 1727204637.08436: Calling all_inventory to load vars for managed-node2 40074 1727204637.08438: Calling groups_inventory to load vars for managed-node2 40074 1727204637.08441: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.08451: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.08454: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.08457: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.09706: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.11353: done with get_vars() 40074 1727204637.11376: done getting variables 40074 1727204637.11431: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.045) 0:00:30.876 ***** 40074 1727204637.11459: entering _queue_task() for managed-node2/fail 40074 1727204637.11731: worker is 1 (out of 1 available) 40074 1727204637.11745: exiting _queue_task() for managed-node2/fail 40074 1727204637.11759: done queuing things up, now waiting for results queue to drain 40074 1727204637.11761: waiting for 
pending results... 40074 1727204637.11955: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 40074 1727204637.12071: in run() - task 12b410aa-8751-9fd7-2501-00000000006f 40074 1727204637.12085: variable 'ansible_search_path' from source: unknown 40074 1727204637.12090: variable 'ansible_search_path' from source: unknown 40074 1727204637.12128: calling self._execute() 40074 1727204637.12224: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.12228: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.12236: variable 'omit' from source: magic vars 40074 1727204637.12563: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.12573: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.12678: variable 'network_state' from source: role '' defaults 40074 1727204637.12688: Evaluated conditional (network_state != {}): False 40074 1727204637.12693: when evaluation is False, skipping this task 40074 1727204637.12698: _execute() done 40074 1727204637.12702: dumping result to json 40074 1727204637.12707: done dumping result, returning 40074 1727204637.12715: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-9fd7-2501-00000000006f] 40074 1727204637.12760: sending task result for task 12b410aa-8751-9fd7-2501-00000000006f 40074 1727204637.12836: done sending task result for task 12b410aa-8751-9fd7-2501-00000000006f 40074 1727204637.12839: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204637.12909: no more pending results, returning what we 
have 40074 1727204637.12913: results queue empty 40074 1727204637.12914: checking for any_errors_fatal 40074 1727204637.12922: done checking for any_errors_fatal 40074 1727204637.12923: checking for max_fail_percentage 40074 1727204637.12925: done checking for max_fail_percentage 40074 1727204637.12926: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.12927: done checking to see if all hosts have failed 40074 1727204637.12928: getting the remaining hosts for this loop 40074 1727204637.12929: done getting the remaining hosts for this loop 40074 1727204637.12933: getting the next task for host managed-node2 40074 1727204637.12939: done getting next task for host managed-node2 40074 1727204637.12943: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 40074 1727204637.12946: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204637.12967: getting variables 40074 1727204637.12969: in VariableManager get_vars() 40074 1727204637.13009: Calling all_inventory to load vars for managed-node2 40074 1727204637.13012: Calling groups_inventory to load vars for managed-node2 40074 1727204637.13021: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.13029: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.13032: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.13034: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.15325: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.18595: done with get_vars() 40074 1727204637.18647: done getting variables 40074 1727204637.18935: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.075) 0:00:30.951 ***** 40074 1727204637.18979: entering _queue_task() for managed-node2/fail 40074 1727204637.19556: worker is 1 (out of 1 available) 40074 1727204637.19569: exiting _queue_task() for managed-node2/fail 40074 1727204637.19582: done queuing things up, now waiting for results queue to drain 40074 1727204637.19584: waiting for pending results... 
40074 1727204637.19821: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 40074 1727204637.19960: in run() - task 12b410aa-8751-9fd7-2501-000000000070 40074 1727204637.19973: variable 'ansible_search_path' from source: unknown 40074 1727204637.19977: variable 'ansible_search_path' from source: unknown 40074 1727204637.20015: calling self._execute() 40074 1727204637.20105: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.20113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.20126: variable 'omit' from source: magic vars 40074 1727204637.20469: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.20486: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.20640: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204637.22597: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204637.22602: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204637.22645: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204637.22695: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204637.22736: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204637.22842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.22903: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.22944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.23016: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.23043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.23172: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.23198: Evaluated conditional (ansible_distribution_major_version | int > 9): True 40074 1727204637.23362: variable 'ansible_distribution' from source: facts 40074 1727204637.23373: variable '__network_rh_distros' from source: role '' defaults 40074 1727204637.23594: Evaluated conditional (ansible_distribution in __network_rh_distros): False 40074 1727204637.23598: when evaluation is False, skipping this task 40074 1727204637.23601: _execute() done 40074 1727204637.23603: dumping result to json 40074 1727204637.23607: done dumping result, returning 40074 1727204637.23611: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-9fd7-2501-000000000070] 40074 1727204637.23614: sending task result for task 12b410aa-8751-9fd7-2501-000000000070 40074 1727204637.23699: done sending task result for task 12b410aa-8751-9fd7-2501-000000000070 40074 1727204637.23703: WORKER PROCESS EXITING 
skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 40074 1727204637.23760: no more pending results, returning what we have 40074 1727204637.23764: results queue empty 40074 1727204637.23765: checking for any_errors_fatal 40074 1727204637.23775: done checking for any_errors_fatal 40074 1727204637.23776: checking for max_fail_percentage 40074 1727204637.23778: done checking for max_fail_percentage 40074 1727204637.23779: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.23780: done checking to see if all hosts have failed 40074 1727204637.23781: getting the remaining hosts for this loop 40074 1727204637.23783: done getting the remaining hosts for this loop 40074 1727204637.23788: getting the next task for host managed-node2 40074 1727204637.23797: done getting next task for host managed-node2 40074 1727204637.23801: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 40074 1727204637.23805: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204637.23823: getting variables 40074 1727204637.23825: in VariableManager get_vars() 40074 1727204637.23869: Calling all_inventory to load vars for managed-node2 40074 1727204637.23872: Calling groups_inventory to load vars for managed-node2 40074 1727204637.23875: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.23885: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.23888: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.23975: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.26205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.29483: done with get_vars() 40074 1727204637.29525: done getting variables 40074 1727204637.29606: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.106) 0:00:31.058 ***** 40074 1727204637.29650: entering _queue_task() for managed-node2/dnf 40074 1727204637.30059: worker is 1 (out of 1 available) 40074 1727204637.30075: exiting _queue_task() for managed-node2/dnf 40074 1727204637.30302: done queuing things up, now waiting for results queue to drain 40074 1727204637.30304: waiting for pending results... 
40074 1727204637.30439: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 40074 1727204637.30627: in run() - task 12b410aa-8751-9fd7-2501-000000000071 40074 1727204637.30658: variable 'ansible_search_path' from source: unknown 40074 1727204637.30669: variable 'ansible_search_path' from source: unknown 40074 1727204637.30721: calling self._execute() 40074 1727204637.30851: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.30872: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.30895: variable 'omit' from source: magic vars 40074 1727204637.31362: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.31383: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.31666: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204637.34452: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204637.34514: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204637.34570: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204637.34624: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204637.34694: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204637.34781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.34828: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.34865: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.34932: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.35094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.35098: variable 'ansible_distribution' from source: facts 40074 1727204637.35111: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.35128: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 40074 1727204637.35280: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204637.35479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.35520: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.35562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.35622: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.35650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.35709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.35747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.35787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.35846: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.35874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.35933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.36082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 
1727204637.36085: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.36090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.36093: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.36301: variable 'network_connections' from source: task vars 40074 1727204637.36325: variable 'interface1' from source: play vars 40074 1727204637.36422: variable 'interface1' from source: play vars 40074 1727204637.36531: variable 'interface1_mac' from source: set_fact 40074 1727204637.36645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204637.36862: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204637.36914: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204637.36965: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204637.37006: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204637.37070: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204637.37104: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, 
class_only=False) 40074 1727204637.37152: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.37196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204637.37277: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204637.37694: variable 'network_connections' from source: task vars 40074 1727204637.37698: variable 'interface1' from source: play vars 40074 1727204637.37737: variable 'interface1' from source: play vars 40074 1727204637.37838: variable 'interface1_mac' from source: set_fact 40074 1727204637.37891: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 40074 1727204637.37902: when evaluation is False, skipping this task 40074 1727204637.37911: _execute() done 40074 1727204637.37922: dumping result to json 40074 1727204637.37936: done dumping result, returning 40074 1727204637.37950: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000071] 40074 1727204637.38042: sending task result for task 12b410aa-8751-9fd7-2501-000000000071 40074 1727204637.38124: done sending task result for task 12b410aa-8751-9fd7-2501-000000000071 40074 1727204637.38127: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 40074 1727204637.38206: no more pending results, returning what we have 40074 1727204637.38210: results queue empty 
40074 1727204637.38211: checking for any_errors_fatal 40074 1727204637.38223: done checking for any_errors_fatal 40074 1727204637.38225: checking for max_fail_percentage 40074 1727204637.38226: done checking for max_fail_percentage 40074 1727204637.38228: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.38229: done checking to see if all hosts have failed 40074 1727204637.38230: getting the remaining hosts for this loop 40074 1727204637.38232: done getting the remaining hosts for this loop 40074 1727204637.38237: getting the next task for host managed-node2 40074 1727204637.38244: done getting next task for host managed-node2 40074 1727204637.38249: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 40074 1727204637.38252: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204637.38274: getting variables 40074 1727204637.38276: in VariableManager get_vars() 40074 1727204637.38329: Calling all_inventory to load vars for managed-node2 40074 1727204637.38332: Calling groups_inventory to load vars for managed-node2 40074 1727204637.38336: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.38349: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.38353: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.38357: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.40909: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.43979: done with get_vars() 40074 1727204637.44025: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 40074 1727204637.44123: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.145) 0:00:31.203 ***** 40074 1727204637.44164: entering _queue_task() for managed-node2/yum 40074 1727204637.44562: worker is 1 (out of 1 available) 40074 1727204637.44578: exiting _queue_task() for managed-node2/yum 40074 1727204637.44796: done queuing things up, now waiting for results queue to drain 40074 1727204637.44798: waiting for pending results... 
40074 1727204637.45008: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 40074 1727204637.45123: in run() - task 12b410aa-8751-9fd7-2501-000000000072 40074 1727204637.45155: variable 'ansible_search_path' from source: unknown 40074 1727204637.45165: variable 'ansible_search_path' from source: unknown 40074 1727204637.45214: calling self._execute() 40074 1727204637.45340: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.45395: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.45400: variable 'omit' from source: magic vars 40074 1727204637.45853: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.45872: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.46128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204637.48320: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204637.48379: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204637.48413: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204637.48447: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204637.48473: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204637.48547: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.48877: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.48909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.48944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.48959: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.49046: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.49060: Evaluated conditional (ansible_distribution_major_version | int < 8): False 40074 1727204637.49063: when evaluation is False, skipping this task 40074 1727204637.49067: _execute() done 40074 1727204637.49072: dumping result to json 40074 1727204637.49076: done dumping result, returning 40074 1727204637.49084: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000072] 40074 1727204637.49092: sending task result for task 12b410aa-8751-9fd7-2501-000000000072 40074 1727204637.49199: done sending task result for task 12b410aa-8751-9fd7-2501-000000000072 40074 1727204637.49201: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 40074 1727204637.49262: no more pending results, returning 
what we have 40074 1727204637.49266: results queue empty 40074 1727204637.49267: checking for any_errors_fatal 40074 1727204637.49276: done checking for any_errors_fatal 40074 1727204637.49276: checking for max_fail_percentage 40074 1727204637.49279: done checking for max_fail_percentage 40074 1727204637.49280: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.49281: done checking to see if all hosts have failed 40074 1727204637.49282: getting the remaining hosts for this loop 40074 1727204637.49283: done getting the remaining hosts for this loop 40074 1727204637.49288: getting the next task for host managed-node2 40074 1727204637.49297: done getting next task for host managed-node2 40074 1727204637.49302: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 40074 1727204637.49305: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204637.49327: getting variables 40074 1727204637.49329: in VariableManager get_vars() 40074 1727204637.49374: Calling all_inventory to load vars for managed-node2 40074 1727204637.49377: Calling groups_inventory to load vars for managed-node2 40074 1727204637.49380: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.49400: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.49404: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.49408: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.51440: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.53059: done with get_vars() 40074 1727204637.53083: done getting variables 40074 1727204637.53137: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.090) 0:00:31.293 ***** 40074 1727204637.53170: entering _queue_task() for managed-node2/fail 40074 1727204637.53443: worker is 1 (out of 1 available) 40074 1727204637.53457: exiting _queue_task() for managed-node2/fail 40074 1727204637.53471: done queuing things up, now waiting for results queue to drain 40074 1727204637.53473: waiting for pending results... 
40074 1727204637.53681: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 40074 1727204637.53827: in run() - task 12b410aa-8751-9fd7-2501-000000000073 40074 1727204637.53832: variable 'ansible_search_path' from source: unknown 40074 1727204637.53835: variable 'ansible_search_path' from source: unknown 40074 1727204637.53995: calling self._execute() 40074 1727204637.53999: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.54001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.54004: variable 'omit' from source: magic vars 40074 1727204637.54435: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.54454: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.54610: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204637.54866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204637.57427: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204637.57515: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204637.57565: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204637.57630: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204637.57666: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204637.57764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 40074 1727204637.57823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.57860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.57920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.57943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.58007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.58041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.58076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.58294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.58299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.58302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.58305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.58308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.58328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.58354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.58574: variable 'network_connections' from source: task vars 40074 1727204637.58595: variable 'interface1' from source: play vars 40074 1727204637.58691: variable 'interface1' from source: play vars 40074 1727204637.58796: variable 'interface1_mac' from source: set_fact 40074 1727204637.58902: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204637.59106: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204637.59158: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204637.59202: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204637.59242: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204637.59299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204637.59332: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204637.59367: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.59408: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204637.59482: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204637.59823: variable 'network_connections' from source: task vars 40074 1727204637.59835: variable 'interface1' from source: play vars 40074 1727204637.59916: variable 'interface1' from source: play vars 40074 1727204637.60012: variable 'interface1_mac' from source: set_fact 40074 1727204637.60066: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 40074 1727204637.60076: when evaluation is False, skipping this task 40074 1727204637.60085: _execute() done 40074 1727204637.60097: dumping result to json 40074 1727204637.60107: done dumping result, returning 40074 1727204637.60121: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 
[12b410aa-8751-9fd7-2501-000000000073] 40074 1727204637.60304: sending task result for task 12b410aa-8751-9fd7-2501-000000000073 40074 1727204637.60383: done sending task result for task 12b410aa-8751-9fd7-2501-000000000073 40074 1727204637.60387: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 40074 1727204637.60450: no more pending results, returning what we have 40074 1727204637.60454: results queue empty 40074 1727204637.60456: checking for any_errors_fatal 40074 1727204637.60463: done checking for any_errors_fatal 40074 1727204637.60464: checking for max_fail_percentage 40074 1727204637.60466: done checking for max_fail_percentage 40074 1727204637.60468: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.60469: done checking to see if all hosts have failed 40074 1727204637.60470: getting the remaining hosts for this loop 40074 1727204637.60471: done getting the remaining hosts for this loop 40074 1727204637.60476: getting the next task for host managed-node2 40074 1727204637.60486: done getting next task for host managed-node2 40074 1727204637.60492: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 40074 1727204637.60495: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204637.60518: getting variables 40074 1727204637.60520: in VariableManager get_vars() 40074 1727204637.60570: Calling all_inventory to load vars for managed-node2 40074 1727204637.60574: Calling groups_inventory to load vars for managed-node2 40074 1727204637.60577: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.60793: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.60799: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.60804: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.63055: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.66192: done with get_vars() 40074 1727204637.66228: done getting variables 40074 1727204637.66301: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.131) 0:00:31.425 ***** 40074 1727204637.66341: entering _queue_task() for managed-node2/package 40074 1727204637.66704: worker is 1 (out of 1 available) 40074 1727204637.66718: exiting _queue_task() for managed-node2/package 40074 1727204637.66731: done queuing things up, now waiting for results queue to drain 40074 1727204637.66732: waiting for pending results... 
40074 1727204637.67042: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 40074 1727204637.67396: in run() - task 12b410aa-8751-9fd7-2501-000000000074 40074 1727204637.67399: variable 'ansible_search_path' from source: unknown 40074 1727204637.67402: variable 'ansible_search_path' from source: unknown 40074 1727204637.67405: calling self._execute() 40074 1727204637.67407: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.67422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.67440: variable 'omit' from source: magic vars 40074 1727204637.67897: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.67916: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.68185: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204637.68509: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204637.68566: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204637.68617: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204637.68701: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204637.68848: variable 'network_packages' from source: role '' defaults 40074 1727204637.68987: variable '__network_provider_setup' from source: role '' defaults 40074 1727204637.69008: variable '__network_service_name_default_nm' from source: role '' defaults 40074 1727204637.69103: variable '__network_service_name_default_nm' from source: role '' defaults 40074 1727204637.69119: variable '__network_packages_default_nm' from source: role '' defaults 40074 1727204637.69202: variable 
'__network_packages_default_nm' from source: role '' defaults 40074 1727204637.69475: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204637.71907: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204637.72195: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204637.72198: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204637.72201: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204637.72203: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204637.72218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.72257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.72295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.72355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.72377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 
1727204637.72442: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.72475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.72510: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.72567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.72588: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.72903: variable '__network_packages_default_gobject_packages' from source: role '' defaults 40074 1727204637.73053: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.73094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.73129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.73187: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.73212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.73326: variable 'ansible_python' from source: facts 40074 1727204637.73362: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 40074 1727204637.73471: variable '__network_wpa_supplicant_required' from source: role '' defaults 40074 1727204637.73578: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 40074 1727204637.73765: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.73802: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.73951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.73954: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.73957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.73986: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.74030: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.74071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.74125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.74150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.74342: variable 'network_connections' from source: task vars 40074 1727204637.74358: variable 'interface1' from source: play vars 40074 1727204637.74487: variable 'interface1' from source: play vars 40074 1727204637.74643: variable 'interface1_mac' from source: set_fact 40074 1727204637.74753: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204637.74798: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204637.74846: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 
(found_in_cache=True, class_only=False) 40074 1727204637.74897: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204637.74964: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204637.75297: variable 'network_connections' from source: task vars 40074 1727204637.75303: variable 'interface1' from source: play vars 40074 1727204637.75388: variable 'interface1' from source: play vars 40074 1727204637.75498: variable 'interface1_mac' from source: set_fact 40074 1727204637.75555: variable '__network_packages_default_wireless' from source: role '' defaults 40074 1727204637.75626: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204637.75878: variable 'network_connections' from source: task vars 40074 1727204637.75883: variable 'interface1' from source: play vars 40074 1727204637.75943: variable 'interface1' from source: play vars 40074 1727204637.76013: variable 'interface1_mac' from source: set_fact 40074 1727204637.76042: variable '__network_packages_default_team' from source: role '' defaults 40074 1727204637.76109: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204637.76364: variable 'network_connections' from source: task vars 40074 1727204637.76368: variable 'interface1' from source: play vars 40074 1727204637.76428: variable 'interface1' from source: play vars 40074 1727204637.76512: variable 'interface1_mac' from source: set_fact 40074 1727204637.76565: variable '__network_service_name_default_initscripts' from source: role '' defaults 40074 1727204637.76621: variable '__network_service_name_default_initscripts' from source: role '' defaults 40074 1727204637.76631: variable '__network_packages_default_initscripts' from source: role '' defaults 40074 1727204637.76687: variable 
'__network_packages_default_initscripts' from source: role '' defaults 40074 1727204637.76870: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 40074 1727204637.77277: variable 'network_connections' from source: task vars 40074 1727204637.77281: variable 'interface1' from source: play vars 40074 1727204637.77339: variable 'interface1' from source: play vars 40074 1727204637.77399: variable 'interface1_mac' from source: set_fact 40074 1727204637.77412: variable 'ansible_distribution' from source: facts 40074 1727204637.77416: variable '__network_rh_distros' from source: role '' defaults 40074 1727204637.77426: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.77447: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 40074 1727204637.77671: variable 'ansible_distribution' from source: facts 40074 1727204637.77711: variable '__network_rh_distros' from source: role '' defaults 40074 1727204637.77714: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.77717: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 40074 1727204637.77938: variable 'ansible_distribution' from source: facts 40074 1727204637.78094: variable '__network_rh_distros' from source: role '' defaults 40074 1727204637.78097: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.78100: variable 'network_provider' from source: set_fact 40074 1727204637.78102: variable 'ansible_facts' from source: unknown 40074 1727204637.79226: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 40074 1727204637.79235: when evaluation is False, skipping this task 40074 1727204637.79244: _execute() done 40074 1727204637.79253: dumping result to json 40074 1727204637.79260: done dumping result, returning 40074 1727204637.79273: done running TaskExecutor() for 
managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-9fd7-2501-000000000074] 40074 1727204637.79283: sending task result for task 12b410aa-8751-9fd7-2501-000000000074 40074 1727204637.79496: done sending task result for task 12b410aa-8751-9fd7-2501-000000000074 40074 1727204637.79500: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 40074 1727204637.79561: no more pending results, returning what we have 40074 1727204637.79566: results queue empty 40074 1727204637.79567: checking for any_errors_fatal 40074 1727204637.79578: done checking for any_errors_fatal 40074 1727204637.79579: checking for max_fail_percentage 40074 1727204637.79581: done checking for max_fail_percentage 40074 1727204637.79583: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.79584: done checking to see if all hosts have failed 40074 1727204637.79585: getting the remaining hosts for this loop 40074 1727204637.79587: done getting the remaining hosts for this loop 40074 1727204637.79594: getting the next task for host managed-node2 40074 1727204637.79603: done getting next task for host managed-node2 40074 1727204637.79608: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 40074 1727204637.79611: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 40074 1727204637.79634: getting variables 40074 1727204637.79636: in VariableManager get_vars() 40074 1727204637.79688: Calling all_inventory to load vars for managed-node2 40074 1727204637.79875: Calling groups_inventory to load vars for managed-node2 40074 1727204637.79880: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.79904: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.79908: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.79911: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.81242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.83108: done with get_vars() 40074 1727204637.83159: done getting variables 40074 1727204637.83247: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.169) 0:00:31.594 ***** 40074 1727204637.83285: entering _queue_task() for managed-node2/package 40074 1727204637.83710: worker is 1 (out of 1 available) 40074 1727204637.83725: exiting _queue_task() for managed-node2/package 40074 1727204637.83740: done queuing things up, now waiting for results queue to drain 40074 1727204637.83741: waiting for pending results... 
40074 1727204637.83953: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 40074 1727204637.84065: in run() - task 12b410aa-8751-9fd7-2501-000000000075 40074 1727204637.84079: variable 'ansible_search_path' from source: unknown 40074 1727204637.84082: variable 'ansible_search_path' from source: unknown 40074 1727204637.84122: calling self._execute() 40074 1727204637.84215: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.84222: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.84234: variable 'omit' from source: magic vars 40074 1727204637.84564: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.84576: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.84684: variable 'network_state' from source: role '' defaults 40074 1727204637.84696: Evaluated conditional (network_state != {}): False 40074 1727204637.84699: when evaluation is False, skipping this task 40074 1727204637.84704: _execute() done 40074 1727204637.84709: dumping result to json 40074 1727204637.84713: done dumping result, returning 40074 1727204637.84735: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-9fd7-2501-000000000075] 40074 1727204637.84738: sending task result for task 12b410aa-8751-9fd7-2501-000000000075 40074 1727204637.84838: done sending task result for task 12b410aa-8751-9fd7-2501-000000000075 40074 1727204637.84840: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204637.84895: no more pending results, returning what we have 40074 1727204637.84900: results queue empty 40074 1727204637.84901: checking 
for any_errors_fatal 40074 1727204637.84911: done checking for any_errors_fatal 40074 1727204637.84912: checking for max_fail_percentage 40074 1727204637.84913: done checking for max_fail_percentage 40074 1727204637.84914: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.84915: done checking to see if all hosts have failed 40074 1727204637.84918: getting the remaining hosts for this loop 40074 1727204637.84920: done getting the remaining hosts for this loop 40074 1727204637.84924: getting the next task for host managed-node2 40074 1727204637.84931: done getting next task for host managed-node2 40074 1727204637.84935: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 40074 1727204637.84938: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204637.84959: getting variables 40074 1727204637.84961: in VariableManager get_vars() 40074 1727204637.85012: Calling all_inventory to load vars for managed-node2 40074 1727204637.85016: Calling groups_inventory to load vars for managed-node2 40074 1727204637.85021: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.85031: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.85034: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.85037: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.86423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.88025: done with get_vars() 40074 1727204637.88052: done getting variables 40074 1727204637.88106: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.048) 0:00:31.643 ***** 40074 1727204637.88134: entering _queue_task() for managed-node2/package 40074 1727204637.88398: worker is 1 (out of 1 available) 40074 1727204637.88412: exiting _queue_task() for managed-node2/package 40074 1727204637.88426: done queuing things up, now waiting for results queue to drain 40074 1727204637.88428: waiting for pending results... 
40074 1727204637.88628: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 40074 1727204637.88742: in run() - task 12b410aa-8751-9fd7-2501-000000000076 40074 1727204637.88756: variable 'ansible_search_path' from source: unknown 40074 1727204637.88762: variable 'ansible_search_path' from source: unknown 40074 1727204637.88796: calling self._execute() 40074 1727204637.88891: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.88898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.88908: variable 'omit' from source: magic vars 40074 1727204637.89234: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.89244: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.89352: variable 'network_state' from source: role '' defaults 40074 1727204637.89362: Evaluated conditional (network_state != {}): False 40074 1727204637.89365: when evaluation is False, skipping this task 40074 1727204637.89368: _execute() done 40074 1727204637.89374: dumping result to json 40074 1727204637.89377: done dumping result, returning 40074 1727204637.89385: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-9fd7-2501-000000000076] 40074 1727204637.89392: sending task result for task 12b410aa-8751-9fd7-2501-000000000076 40074 1727204637.89494: done sending task result for task 12b410aa-8751-9fd7-2501-000000000076 40074 1727204637.89497: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204637.89546: no more pending results, returning what we have 40074 1727204637.89551: results queue empty 40074 1727204637.89552: checking for 
any_errors_fatal 40074 1727204637.89562: done checking for any_errors_fatal 40074 1727204637.89563: checking for max_fail_percentage 40074 1727204637.89565: done checking for max_fail_percentage 40074 1727204637.89566: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.89567: done checking to see if all hosts have failed 40074 1727204637.89568: getting the remaining hosts for this loop 40074 1727204637.89569: done getting the remaining hosts for this loop 40074 1727204637.89573: getting the next task for host managed-node2 40074 1727204637.89581: done getting next task for host managed-node2 40074 1727204637.89586: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 40074 1727204637.89590: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204637.89608: getting variables 40074 1727204637.89610: in VariableManager get_vars() 40074 1727204637.89648: Calling all_inventory to load vars for managed-node2 40074 1727204637.89651: Calling groups_inventory to load vars for managed-node2 40074 1727204637.89654: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.89663: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.89666: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.89669: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.90888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204637.92596: done with get_vars() 40074 1727204637.92621: done getting variables 40074 1727204637.92671: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 15:03:57 -0400 (0:00:00.045) 0:00:31.688 ***** 40074 1727204637.92701: entering _queue_task() for managed-node2/service 40074 1727204637.92955: worker is 1 (out of 1 available) 40074 1727204637.92969: exiting _queue_task() for managed-node2/service 40074 1727204637.92983: done queuing things up, now waiting for results queue to drain 40074 1727204637.92985: waiting for pending results... 
40074 1727204637.93182: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 40074 1727204637.93294: in run() - task 12b410aa-8751-9fd7-2501-000000000077 40074 1727204637.93306: variable 'ansible_search_path' from source: unknown 40074 1727204637.93311: variable 'ansible_search_path' from source: unknown 40074 1727204637.93348: calling self._execute() 40074 1727204637.93438: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204637.93444: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204637.93456: variable 'omit' from source: magic vars 40074 1727204637.93780: variable 'ansible_distribution_major_version' from source: facts 40074 1727204637.93792: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204637.93896: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204637.94064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204637.95847: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204637.95902: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204637.95934: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204637.95969: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204637.95993: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204637.96062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 40074 1727204637.96107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.96132: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.96166: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.96185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.96231: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.96251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.96273: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.96311: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.96327: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.96363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204637.96384: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204637.96410: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.96444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204637.96456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204637.96603: variable 'network_connections' from source: task vars 40074 1727204637.96619: variable 'interface1' from source: play vars 40074 1727204637.96679: variable 'interface1' from source: play vars 40074 1727204637.96752: variable 'interface1_mac' from source: set_fact 40074 1727204637.96825: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204637.96960: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204637.96994: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204637.97027: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204637.97054: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204637.97093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204637.97111: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204637.97135: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204637.97162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204637.97216: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204637.97435: variable 'network_connections' from source: task vars 40074 1727204637.97439: variable 'interface1' from source: play vars 40074 1727204637.97494: variable 'interface1' from source: play vars 40074 1727204637.97556: variable 'interface1_mac' from source: set_fact 40074 1727204637.97589: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 40074 1727204637.97592: when evaluation is False, skipping this task 40074 1727204637.97601: _execute() done 40074 1727204637.97605: dumping result to json 40074 1727204637.97607: done dumping result, returning 40074 1727204637.97617: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000077] 
40074 1727204637.97627: sending task result for task 12b410aa-8751-9fd7-2501-000000000077 40074 1727204637.97721: done sending task result for task 12b410aa-8751-9fd7-2501-000000000077 40074 1727204637.97724: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
40074 1727204637.97773: no more pending results, returning what we have 40074 1727204637.97777: results queue empty 40074 1727204637.97778: checking for any_errors_fatal 40074 1727204637.97785: done checking for any_errors_fatal 40074 1727204637.97786: checking for max_fail_percentage 40074 1727204637.97788: done checking for max_fail_percentage 40074 1727204637.97791: checking to see if all hosts have failed and the running result is not ok 40074 1727204637.97792: done checking to see if all hosts have failed 40074 1727204637.97793: getting the remaining hosts for this loop 40074 1727204637.97795: done getting the remaining hosts for this loop 40074 1727204637.97799: getting the next task for host managed-node2 40074 1727204637.97807: done getting next task for host managed-node2 40074 1727204637.97811: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 40074 1727204637.97814: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 40074 1727204637.97838: getting variables 40074 1727204637.97839: in VariableManager get_vars() 40074 1727204637.97884: Calling all_inventory to load vars for managed-node2 40074 1727204637.97888: Calling groups_inventory to load vars for managed-node2 40074 1727204637.97902: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204637.97914: Calling all_plugins_play to load vars for managed-node2 40074 1727204637.97917: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204637.97920: Calling groups_plugins_play to load vars for managed-node2 40074 1727204637.99205: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204638.00843: done with get_vars() 40074 1727204638.00869: done getting variables 40074 1727204638.00924: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.082) 0:00:31.771 ***** 40074 1727204638.00955: entering _queue_task() for managed-node2/service 40074 1727204638.01232: worker is 1 (out of 1 available) 40074 1727204638.01248: exiting _queue_task() for managed-node2/service 40074 1727204638.01262: done queuing things up, now waiting for results queue to drain 40074 1727204638.01263: waiting for pending results... 
40074 1727204638.01456: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 40074 1727204638.01565: in run() - task 12b410aa-8751-9fd7-2501-000000000078 40074 1727204638.01579: variable 'ansible_search_path' from source: unknown 40074 1727204638.01582: variable 'ansible_search_path' from source: unknown 40074 1727204638.01623: calling self._execute() 40074 1727204638.01710: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204638.01726: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204638.01730: variable 'omit' from source: magic vars 40074 1727204638.02052: variable 'ansible_distribution_major_version' from source: facts 40074 1727204638.02065: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204638.02210: variable 'network_provider' from source: set_fact 40074 1727204638.02214: variable 'network_state' from source: role '' defaults 40074 1727204638.02226: Evaluated conditional (network_provider == "nm" or network_state != {}): True 40074 1727204638.02232: variable 'omit' from source: magic vars 40074 1727204638.02286: variable 'omit' from source: magic vars 40074 1727204638.02313: variable 'network_service_name' from source: role '' defaults 40074 1727204638.02378: variable 'network_service_name' from source: role '' defaults 40074 1727204638.02470: variable '__network_provider_setup' from source: role '' defaults 40074 1727204638.02476: variable '__network_service_name_default_nm' from source: role '' defaults 40074 1727204638.02532: variable '__network_service_name_default_nm' from source: role '' defaults 40074 1727204638.02541: variable '__network_packages_default_nm' from source: role '' defaults 40074 1727204638.02594: variable '__network_packages_default_nm' from source: role '' defaults 40074 1727204638.02802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due 
to reserved name 40074 1727204638.04556: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204638.04610: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204638.04646: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204638.04969: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204638.05000: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204638.05069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204638.05099: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.05126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.05159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.05172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.05218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 40074 1727204638.05241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.05262: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.05293: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.05308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.05502: variable '__network_packages_default_gobject_packages' from source: role '' defaults 40074 1727204638.05605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204638.05629: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.05652: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.05685: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.05699: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.05778: variable 'ansible_python' from source: facts 40074 1727204638.05800: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 40074 1727204638.05873: variable '__network_wpa_supplicant_required' from source: role '' defaults 40074 1727204638.05940: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 40074 1727204638.06052: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204638.06073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.06100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.06134: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.06146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.06189: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204638.06216: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.06238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.06269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.06283: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.06398: variable 'network_connections' from source: task vars 40074 1727204638.06406: variable 'interface1' from source: play vars 40074 1727204638.06473: variable 'interface1' from source: play vars 40074 1727204638.06555: variable 'interface1_mac' from source: set_fact 40074 1727204638.06658: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204638.06805: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204638.06850: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204638.06893: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204638.06929: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204638.06983: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 
1727204638.07010: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204638.07039: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.07068: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204638.07114: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204638.07363: variable 'network_connections' from source: task vars 40074 1727204638.07371: variable 'interface1' from source: play vars 40074 1727204638.07440: variable 'interface1' from source: play vars 40074 1727204638.07519: variable 'interface1_mac' from source: set_fact 40074 1727204638.07566: variable '__network_packages_default_wireless' from source: role '' defaults 40074 1727204638.07696: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204638.07882: variable 'network_connections' from source: task vars 40074 1727204638.07886: variable 'interface1' from source: play vars 40074 1727204638.07955: variable 'interface1' from source: play vars 40074 1727204638.08027: variable 'interface1_mac' from source: set_fact 40074 1727204638.08051: variable '__network_packages_default_team' from source: role '' defaults 40074 1727204638.08120: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204638.08369: variable 'network_connections' from source: task vars 40074 1727204638.08374: variable 'interface1' from source: play vars 40074 1727204638.08440: variable 'interface1' from source: play vars 40074 1727204638.08514: variable 'interface1_mac' from 
source: set_fact 40074 1727204638.08568: variable '__network_service_name_default_initscripts' from source: role '' defaults 40074 1727204638.08625: variable '__network_service_name_default_initscripts' from source: role '' defaults 40074 1727204638.08632: variable '__network_packages_default_initscripts' from source: role '' defaults 40074 1727204638.08682: variable '__network_packages_default_initscripts' from source: role '' defaults 40074 1727204638.08871: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 40074 1727204638.09292: variable 'network_connections' from source: task vars 40074 1727204638.09297: variable 'interface1' from source: play vars 40074 1727204638.09350: variable 'interface1' from source: play vars 40074 1727204638.09413: variable 'interface1_mac' from source: set_fact 40074 1727204638.09428: variable 'ansible_distribution' from source: facts 40074 1727204638.09432: variable '__network_rh_distros' from source: role '' defaults 40074 1727204638.09439: variable 'ansible_distribution_major_version' from source: facts 40074 1727204638.09459: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 40074 1727204638.09612: variable 'ansible_distribution' from source: facts 40074 1727204638.09616: variable '__network_rh_distros' from source: role '' defaults 40074 1727204638.09624: variable 'ansible_distribution_major_version' from source: facts 40074 1727204638.09632: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 40074 1727204638.09775: variable 'ansible_distribution' from source: facts 40074 1727204638.09783: variable '__network_rh_distros' from source: role '' defaults 40074 1727204638.09791: variable 'ansible_distribution_major_version' from source: facts 40074 1727204638.09827: variable 'network_provider' from source: set_fact 40074 1727204638.09848: variable 'omit' from source: magic vars 40074 1727204638.09872: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204638.09937: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204638.09941: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204638.09943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204638.09956: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204638.09981: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204638.09984: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204638.09992: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204638.10078: Set connection var ansible_pipelining to False 40074 1727204638.10085: Set connection var ansible_shell_executable to /bin/sh 40074 1727204638.10088: Set connection var ansible_shell_type to sh 40074 1727204638.10093: Set connection var ansible_connection to ssh 40074 1727204638.10102: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204638.10109: Set connection var ansible_timeout to 10 40074 1727204638.10133: variable 'ansible_shell_executable' from source: unknown 40074 1727204638.10137: variable 'ansible_connection' from source: unknown 40074 1727204638.10140: variable 'ansible_module_compression' from source: unknown 40074 1727204638.10143: variable 'ansible_shell_type' from source: unknown 40074 1727204638.10147: variable 'ansible_shell_executable' from source: unknown 40074 1727204638.10150: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204638.10156: variable 'ansible_pipelining' from source: unknown 40074 1727204638.10159: variable 'ansible_timeout' from 
source: unknown 40074 1727204638.10170: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204638.10254: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204638.10265: variable 'omit' from source: magic vars 40074 1727204638.10273: starting attempt loop 40074 1727204638.10276: running the handler 40074 1727204638.10343: variable 'ansible_facts' from source: unknown 40074 1727204638.10973: _low_level_execute_command(): starting 40074 1727204638.10976: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204638.11526: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204638.11532: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204638.11596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204638.11602: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204638.11604: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204638.11650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204638.13438: stdout chunk (state=3): >>>/root <<< 40074 1727204638.13549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204638.13610: stderr chunk (state=3): >>><<< 40074 1727204638.13614: stdout chunk (state=3): >>><<< 40074 1727204638.13637: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204638.13649: _low_level_execute_command(): starting 40074 1727204638.13655: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933 `" && echo ansible-tmp-1727204638.1363635-41406-37757789510933="` echo /root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933 `" ) && sleep 0' 40074 1727204638.14203: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204638.14253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204638.14257: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204638.14305: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204638.14365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204638.16451: stdout chunk (state=3): >>>ansible-tmp-1727204638.1363635-41406-37757789510933=/root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933 <<< 40074 1727204638.16572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204638.16623: stderr chunk (state=3): >>><<< 40074 1727204638.16627: stdout chunk (state=3): >>><<< 40074 
1727204638.16641: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204638.1363635-41406-37757789510933=/root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204638.16672: variable 'ansible_module_compression' from source: unknown 40074 1727204638.16722: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 40074 1727204638.16777: variable 'ansible_facts' from source: unknown 40074 1727204638.16922: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/AnsiballZ_systemd.py 40074 1727204638.17040: Sending initial data 40074 1727204638.17043: Sent initial data (155 bytes) 40074 1727204638.17495: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204638.17499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204638.17506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204638.17509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204638.17562: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204638.17569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204638.17615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204638.19395: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: 
Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204638.19398: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204638.19401: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpw5lc2fw8 /root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/AnsiballZ_systemd.py <<< 40074 1727204638.19404: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/AnsiballZ_systemd.py" <<< 40074 1727204638.19442: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpw5lc2fw8" to remote "/root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/AnsiballZ_systemd.py" <<< 40074 1727204638.22842: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204638.22894: stderr chunk (state=3): >>><<< 40074 1727204638.22908: stdout chunk (state=3): >>><<< 40074 1727204638.22951: done transferring module to remote 40074 1727204638.22970: _low_level_execute_command(): starting 40074 1727204638.22988: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/ /root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/AnsiballZ_systemd.py && sleep 0' 40074 1727204638.23726: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204638.23748: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204638.23810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 40074 1727204638.23826: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204638.23937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204638.23941: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204638.23974: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204638.24101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204638.26097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204638.26482: stderr chunk (state=3): >>><<< 40074 1727204638.26486: stdout chunk (state=3): >>><<< 40074 1727204638.26490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204638.26495: _low_level_execute_command(): starting 40074 1727204638.26497: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/AnsiballZ_systemd.py && sleep 0' 40074 1727204638.27573: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204638.27592: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204638.27609: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204638.27632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204638.27709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204638.27929: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204638.27947: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204638.28038: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204638.61601: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon 
; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4583424", "MemoryAvailable": "infinity", "CPUUsageNSec": "2420446000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", 
"MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 40074 1727204638.61624: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid 
cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", 
"WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target network.target network.service cloud-init.service NetworkManager-wait-online.service", "After": "systemd-journald.socket sysinit.target dbus.socket cloud-init-local.service system.slice network-pre.target basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 40074 1727204638.61640: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:49 EDT", "StateChangeTimestampMonotonic": "1013574884", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", 
"JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 40074 1727204638.63702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204638.63763: stderr chunk (state=3): >>><<< 40074 1727204638.63766: stdout chunk (state=3): >>><<< 40074 1727204638.63787: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", 
"ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4583424", "MemoryAvailable": "infinity", "CPUUsageNSec": "2420446000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", 
"IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", 
"CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target network.target network.service cloud-init.service NetworkManager-wait-online.service", "After": "systemd-journald.socket sysinit.target dbus.socket cloud-init-local.service system.slice network-pre.target basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:49 EDT", "StateChangeTimestampMonotonic": "1013574884", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", 
"ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204638.63956: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204638.63973: _low_level_execute_command(): starting 40074 1727204638.63978: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204638.1363635-41406-37757789510933/ > /dev/null 2>&1 && sleep 0' 40074 1727204638.64467: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204638.64471: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204638.64473: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204638.64476: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204638.64527: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204638.64531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204638.64581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204638.66558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204638.66608: stderr chunk (state=3): >>><<< 40074 1727204638.66612: stdout chunk (state=3): >>><<< 40074 1727204638.66631: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204638.66640: handler run complete 40074 1727204638.66687: attempt loop complete, returning result 40074 1727204638.66691: _execute() done 40074 1727204638.66694: dumping result to json 40074 1727204638.66710: done dumping result, returning 40074 1727204638.66719: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-9fd7-2501-000000000078] 40074 1727204638.66729: sending task result for task 12b410aa-8751-9fd7-2501-000000000078 40074 1727204638.66962: done sending task result for task 12b410aa-8751-9fd7-2501-000000000078 40074 1727204638.66965: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204638.67032: no more pending results, returning what we have 40074 1727204638.67036: results queue empty 40074 1727204638.67037: checking for any_errors_fatal 40074 1727204638.67046: done checking for any_errors_fatal 40074 1727204638.67047: checking for max_fail_percentage 40074 1727204638.67049: done checking for max_fail_percentage 40074 1727204638.67050: checking to see if all hosts have failed and the running result is not ok 40074 1727204638.67051: done checking to see if all hosts have failed 40074 1727204638.67052: getting the remaining hosts for this loop 40074 1727204638.67053: done getting the 
remaining hosts for this loop 40074 1727204638.67058: getting the next task for host managed-node2 40074 1727204638.67064: done getting next task for host managed-node2 40074 1727204638.67068: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 40074 1727204638.67071: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204638.67096: getting variables 40074 1727204638.67098: in VariableManager get_vars() 40074 1727204638.67142: Calling all_inventory to load vars for managed-node2 40074 1727204638.67145: Calling groups_inventory to load vars for managed-node2 40074 1727204638.67148: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204638.67159: Calling all_plugins_play to load vars for managed-node2 40074 1727204638.67162: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204638.67166: Calling groups_plugins_play to load vars for managed-node2 40074 1727204638.68584: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204638.70192: done with get_vars() 40074 1727204638.70215: done getting variables 40074 1727204638.70270: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.693) 0:00:32.464 ***** 40074 1727204638.70300: entering _queue_task() for managed-node2/service 40074 1727204638.70561: worker is 1 (out of 1 available) 40074 1727204638.70575: exiting _queue_task() for managed-node2/service 40074 1727204638.70591: done queuing things up, now waiting for results queue to drain 40074 1727204638.70593: waiting for pending results... 40074 1727204638.70803: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 40074 1727204638.70916: in run() - task 12b410aa-8751-9fd7-2501-000000000079 40074 1727204638.70934: variable 'ansible_search_path' from source: unknown 40074 1727204638.70938: variable 'ansible_search_path' from source: unknown 40074 1727204638.70972: calling self._execute() 40074 1727204638.71062: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204638.71069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204638.71080: variable 'omit' from source: magic vars 40074 1727204638.71414: variable 'ansible_distribution_major_version' from source: facts 40074 1727204638.71428: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204638.71533: variable 'network_provider' from source: set_fact 40074 1727204638.71537: Evaluated conditional (network_provider == "nm"): True 40074 1727204638.71624: variable '__network_wpa_supplicant_required' from source: role '' defaults 40074 1727204638.71700: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 40074 1727204638.71859: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 
1727204638.73568: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204638.73625: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204638.73657: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204638.73696: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204638.73723: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204638.73806: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204638.73832: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.73854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.73893: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.73907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.73948: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 40074 1727204638.73970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.73999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.74029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.74042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.74076: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204638.74107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.74126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.74156: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.74168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.74290: variable 'network_connections' from source: task vars 40074 1727204638.74303: variable 'interface1' from source: play vars 40074 1727204638.74366: variable 'interface1' from source: play vars 40074 1727204638.74433: variable 'interface1_mac' from source: set_fact 40074 1727204638.74504: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204638.74639: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204638.74675: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204638.74703: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204638.74728: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204638.74771: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204638.74791: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204638.74811: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.74834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204638.74878: variable '__network_wireless_connections_defined' from source: role '' defaults 
40074 1727204638.75099: variable 'network_connections' from source: task vars 40074 1727204638.75103: variable 'interface1' from source: play vars 40074 1727204638.75156: variable 'interface1' from source: play vars 40074 1727204638.75223: variable 'interface1_mac' from source: set_fact 40074 1727204638.75259: Evaluated conditional (__network_wpa_supplicant_required): False 40074 1727204638.75262: when evaluation is False, skipping this task 40074 1727204638.75266: _execute() done 40074 1727204638.75272: dumping result to json 40074 1727204638.75275: done dumping result, returning 40074 1727204638.75287: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-9fd7-2501-000000000079] 40074 1727204638.75290: sending task result for task 12b410aa-8751-9fd7-2501-000000000079 40074 1727204638.75386: done sending task result for task 12b410aa-8751-9fd7-2501-000000000079 40074 1727204638.75392: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 40074 1727204638.75452: no more pending results, returning what we have 40074 1727204638.75456: results queue empty 40074 1727204638.75457: checking for any_errors_fatal 40074 1727204638.75478: done checking for any_errors_fatal 40074 1727204638.75479: checking for max_fail_percentage 40074 1727204638.75481: done checking for max_fail_percentage 40074 1727204638.75482: checking to see if all hosts have failed and the running result is not ok 40074 1727204638.75483: done checking to see if all hosts have failed 40074 1727204638.75484: getting the remaining hosts for this loop 40074 1727204638.75485: done getting the remaining hosts for this loop 40074 1727204638.75492: getting the next task for host managed-node2 40074 1727204638.75499: done getting next task for host managed-node2 40074 1727204638.75503: ^ task is: TASK: 
fedora.linux_system_roles.network : Enable network service 40074 1727204638.75506: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204638.75526: getting variables 40074 1727204638.75528: in VariableManager get_vars() 40074 1727204638.75570: Calling all_inventory to load vars for managed-node2 40074 1727204638.75573: Calling groups_inventory to load vars for managed-node2 40074 1727204638.75576: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204638.75586: Calling all_plugins_play to load vars for managed-node2 40074 1727204638.75597: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204638.75602: Calling groups_plugins_play to load vars for managed-node2 40074 1727204638.76856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204638.78565: done with get_vars() 40074 1727204638.78592: done getting variables 40074 1727204638.78645: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 
Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.083) 0:00:32.548 ***** 40074 1727204638.78676: entering _queue_task() for managed-node2/service 40074 1727204638.78957: worker is 1 (out of 1 available) 40074 1727204638.78972: exiting _queue_task() for managed-node2/service 40074 1727204638.78986: done queuing things up, now waiting for results queue to drain 40074 1727204638.78988: waiting for pending results... 40074 1727204638.79198: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 40074 1727204638.79315: in run() - task 12b410aa-8751-9fd7-2501-00000000007a 40074 1727204638.79335: variable 'ansible_search_path' from source: unknown 40074 1727204638.79339: variable 'ansible_search_path' from source: unknown 40074 1727204638.79373: calling self._execute() 40074 1727204638.79464: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204638.79472: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204638.79483: variable 'omit' from source: magic vars 40074 1727204638.79816: variable 'ansible_distribution_major_version' from source: facts 40074 1727204638.79829: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204638.79933: variable 'network_provider' from source: set_fact 40074 1727204638.79938: Evaluated conditional (network_provider == "initscripts"): False 40074 1727204638.79942: when evaluation is False, skipping this task 40074 1727204638.79947: _execute() done 40074 1727204638.79953: dumping result to json 40074 1727204638.79956: done dumping result, returning 40074 1727204638.79965: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-9fd7-2501-00000000007a] 40074 1727204638.79970: sending task result for task 12b410aa-8751-9fd7-2501-00000000007a 40074 1727204638.80068: done sending task result for task 12b410aa-8751-9fd7-2501-00000000007a 
40074 1727204638.80072: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204638.80130: no more pending results, returning what we have 40074 1727204638.80134: results queue empty 40074 1727204638.80135: checking for any_errors_fatal 40074 1727204638.80147: done checking for any_errors_fatal 40074 1727204638.80148: checking for max_fail_percentage 40074 1727204638.80150: done checking for max_fail_percentage 40074 1727204638.80151: checking to see if all hosts have failed and the running result is not ok 40074 1727204638.80152: done checking to see if all hosts have failed 40074 1727204638.80153: getting the remaining hosts for this loop 40074 1727204638.80156: done getting the remaining hosts for this loop 40074 1727204638.80160: getting the next task for host managed-node2 40074 1727204638.80166: done getting next task for host managed-node2 40074 1727204638.80170: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 40074 1727204638.80174: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204638.80197: getting variables 40074 1727204638.80200: in VariableManager get_vars() 40074 1727204638.80238: Calling all_inventory to load vars for managed-node2 40074 1727204638.80242: Calling groups_inventory to load vars for managed-node2 40074 1727204638.80244: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204638.80254: Calling all_plugins_play to load vars for managed-node2 40074 1727204638.80257: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204638.80260: Calling groups_plugins_play to load vars for managed-node2 40074 1727204638.81523: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204638.83167: done with get_vars() 40074 1727204638.83196: done getting variables 40074 1727204638.83254: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.046) 0:00:32.594 ***** 40074 1727204638.83284: entering _queue_task() for managed-node2/copy 40074 1727204638.83574: worker is 1 (out of 1 available) 40074 1727204638.83591: exiting _queue_task() for managed-node2/copy 40074 1727204638.83605: done queuing things up, now waiting for results queue to drain 40074 1727204638.83607: waiting for pending results... 
40074 1727204638.83807: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 40074 1727204638.83926: in run() - task 12b410aa-8751-9fd7-2501-00000000007b 40074 1727204638.83941: variable 'ansible_search_path' from source: unknown 40074 1727204638.83945: variable 'ansible_search_path' from source: unknown 40074 1727204638.83981: calling self._execute() 40074 1727204638.84069: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204638.84083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204638.84095: variable 'omit' from source: magic vars 40074 1727204638.84422: variable 'ansible_distribution_major_version' from source: facts 40074 1727204638.84431: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204638.84532: variable 'network_provider' from source: set_fact 40074 1727204638.84536: Evaluated conditional (network_provider == "initscripts"): False 40074 1727204638.84541: when evaluation is False, skipping this task 40074 1727204638.84546: _execute() done 40074 1727204638.84552: dumping result to json 40074 1727204638.84556: done dumping result, returning 40074 1727204638.84565: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-9fd7-2501-00000000007b] 40074 1727204638.84570: sending task result for task 12b410aa-8751-9fd7-2501-00000000007b 40074 1727204638.84672: done sending task result for task 12b410aa-8751-9fd7-2501-00000000007b 40074 1727204638.84675: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 40074 1727204638.84734: no more pending results, returning what we have 40074 1727204638.84738: results queue empty 40074 1727204638.84739: checking for 
any_errors_fatal 40074 1727204638.84747: done checking for any_errors_fatal 40074 1727204638.84748: checking for max_fail_percentage 40074 1727204638.84751: done checking for max_fail_percentage 40074 1727204638.84752: checking to see if all hosts have failed and the running result is not ok 40074 1727204638.84753: done checking to see if all hosts have failed 40074 1727204638.84754: getting the remaining hosts for this loop 40074 1727204638.84755: done getting the remaining hosts for this loop 40074 1727204638.84760: getting the next task for host managed-node2 40074 1727204638.84767: done getting next task for host managed-node2 40074 1727204638.84771: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 40074 1727204638.84774: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204638.84805: getting variables 40074 1727204638.84807: in VariableManager get_vars() 40074 1727204638.84848: Calling all_inventory to load vars for managed-node2 40074 1727204638.84851: Calling groups_inventory to load vars for managed-node2 40074 1727204638.84853: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204638.84863: Calling all_plugins_play to load vars for managed-node2 40074 1727204638.84866: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204638.84869: Calling groups_plugins_play to load vars for managed-node2 40074 1727204638.90767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204638.92383: done with get_vars() 40074 1727204638.92414: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 15:03:58 -0400 (0:00:00.091) 0:00:32.686 ***** 40074 1727204638.92481: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 40074 1727204638.92770: worker is 1 (out of 1 available) 40074 1727204638.92786: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 40074 1727204638.92802: done queuing things up, now waiting for results queue to drain 40074 1727204638.92804: waiting for pending results... 
40074 1727204638.92999: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 40074 1727204638.93296: in run() - task 12b410aa-8751-9fd7-2501-00000000007c 40074 1727204638.93301: variable 'ansible_search_path' from source: unknown 40074 1727204638.93305: variable 'ansible_search_path' from source: unknown 40074 1727204638.93309: calling self._execute() 40074 1727204638.93347: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204638.93361: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204638.93378: variable 'omit' from source: magic vars 40074 1727204638.93828: variable 'ansible_distribution_major_version' from source: facts 40074 1727204638.93849: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204638.93863: variable 'omit' from source: magic vars 40074 1727204638.93939: variable 'omit' from source: magic vars 40074 1727204638.94136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204638.96504: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204638.96601: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204638.96650: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204638.96701: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204638.96738: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204638.96840: variable 'network_provider' from source: set_fact 40074 1727204638.97008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204638.97048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204638.97086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204638.97144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204638.97170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204638.97262: variable 'omit' from source: magic vars 40074 1727204638.97407: variable 'omit' from source: magic vars 40074 1727204638.97537: variable 'network_connections' from source: task vars 40074 1727204638.97559: variable 'interface1' from source: play vars 40074 1727204638.97649: variable 'interface1' from source: play vars 40074 1727204638.97748: variable 'interface1_mac' from source: set_fact 40074 1727204638.97998: variable 'omit' from source: magic vars 40074 1727204638.98082: variable '__lsr_ansible_managed' from source: task vars 40074 1727204638.98087: variable '__lsr_ansible_managed' from source: task vars 40074 1727204638.98447: Loaded config def from plugin (lookup/template) 40074 1727204638.98458: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 40074 1727204638.98494: File lookup term: get_ansible_managed.j2 40074 1727204638.98503: variable 'ansible_search_path' from source: 
unknown 40074 1727204638.98518: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 40074 1727204638.98539: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 40074 1727204638.98560: variable 'ansible_search_path' from source: unknown 40074 1727204639.08903: variable 'ansible_managed' from source: unknown 40074 1727204639.09295: variable 'omit' from source: magic vars 40074 1727204639.09299: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204639.09302: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204639.09305: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204639.09307: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 40074 1727204639.09310: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204639.09346: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204639.09356: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204639.09366: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204639.09502: Set connection var ansible_pipelining to False 40074 1727204639.09516: Set connection var ansible_shell_executable to /bin/sh 40074 1727204639.09525: Set connection var ansible_shell_type to sh 40074 1727204639.09540: Set connection var ansible_connection to ssh 40074 1727204639.09554: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204639.09567: Set connection var ansible_timeout to 10 40074 1727204639.09605: variable 'ansible_shell_executable' from source: unknown 40074 1727204639.09615: variable 'ansible_connection' from source: unknown 40074 1727204639.09624: variable 'ansible_module_compression' from source: unknown 40074 1727204639.09633: variable 'ansible_shell_type' from source: unknown 40074 1727204639.09751: variable 'ansible_shell_executable' from source: unknown 40074 1727204639.09755: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204639.09758: variable 'ansible_pipelining' from source: unknown 40074 1727204639.09760: variable 'ansible_timeout' from source: unknown 40074 1727204639.09763: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204639.09848: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204639.09881: variable 'omit' from source: magic vars 40074 
1727204639.09898: starting attempt loop 40074 1727204639.09907: running the handler 40074 1727204639.09927: _low_level_execute_command(): starting 40074 1727204639.09973: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204639.10688: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204639.10718: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204639.10735: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.10758: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204639.10873: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204639.10902: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.10982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.12784: stdout chunk (state=3): >>>/root <<< 40074 1727204639.12898: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204639.12955: stderr chunk (state=3): >>><<< 40074 1727204639.12958: stdout chunk (state=3): >>><<< 40074 
1727204639.12979: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204639.12997: _low_level_execute_command(): starting 40074 1727204639.13000: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429 `" && echo ansible-tmp-1727204639.1297956-41440-164227624647429="` echo /root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429 `" ) && sleep 0' 40074 1727204639.13475: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.13479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.13481: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.13484: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.13535: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204639.13540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204639.13543: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.13588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.15656: stdout chunk (state=3): >>>ansible-tmp-1727204639.1297956-41440-164227624647429=/root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429 <<< 40074 1727204639.15772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204639.15840: stderr chunk (state=3): >>><<< 40074 1727204639.15843: stdout chunk (state=3): >>><<< 40074 1727204639.15867: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204639.1297956-41440-164227624647429=/root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204639.15912: variable 'ansible_module_compression' from source: unknown 40074 1727204639.15953: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 40074 1727204639.15982: variable 'ansible_facts' from source: unknown 40074 1727204639.16055: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/AnsiballZ_network_connections.py 40074 1727204639.16176: Sending initial data 40074 1727204639.16179: Sent initial data (168 bytes) 40074 1727204639.16673: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204639.16677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.16683: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.16685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.16734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204639.16740: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.16788: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.18510: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204639.18515: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204639.18556: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp102umsbs /root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/AnsiballZ_network_connections.py <<< 40074 1727204639.18560: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/AnsiballZ_network_connections.py" <<< 40074 1727204639.18619: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp102umsbs" to remote "/root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/AnsiballZ_network_connections.py" <<< 40074 1727204639.20322: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204639.20367: stderr chunk (state=3): >>><<< 40074 1727204639.20377: stdout chunk (state=3): >>><<< 40074 1727204639.20413: done transferring module to remote 40074 1727204639.20432: _low_level_execute_command(): starting 40074 1727204639.20451: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/ /root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/AnsiballZ_network_connections.py && sleep 0' 40074 1727204639.21217: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.21274: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204639.21293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204639.21326: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.21393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.23406: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204639.23512: stderr chunk (state=3): >>><<< 40074 1727204639.23524: stdout chunk (state=3): >>><<< 40074 1727204639.23560: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204639.23569: _low_level_execute_command(): starting 40074 1727204639.23579: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/AnsiballZ_network_connections.py && sleep 0' 40074 1727204639.24208: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204639.24219: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204639.24246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.24253: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204639.24265: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204639.24274: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204639.24285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.24304: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204639.24313: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204639.24405: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204639.24466: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204639.24473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.24529: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.57386: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "be:be:47:b2:eb:46", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "be:be:47:b2:eb:46", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 40074 1727204639.59415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204639.59476: stderr chunk (state=3): >>><<< 40074 1727204639.59480: stdout chunk (state=3): >>><<< 40074 1727204639.59503: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "be:be:47:b2:eb:46", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest1", "mac": "be:be:47:b2:eb:46", "type": "ethernet", "autoconnect": false, "ip": {"address": ["198.51.100.4/24", "2001:db8::6/32"], "route": [{"network": "198.58.10.64", "prefix": 26, "gateway": "198.51.100.102", "metric": 4}]}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204639.59552: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest1', 'mac': 'be:be:47:b2:eb:46', 'type': 'ethernet', 'autoconnect': False, 'ip': {'address': ['198.51.100.4/24', '2001:db8::6/32'], 'route': [{'network': '198.58.10.64', 'prefix': 26, 'gateway': '198.51.100.102', 'metric': 4}]}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204639.59562: _low_level_execute_command(): starting 40074 1727204639.59568: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204639.1297956-41440-164227624647429/ > /dev/null 2>&1 && sleep 0' 40074 1727204639.60070: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.60074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.60077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.60079: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.60130: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204639.60137: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.60179: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.62149: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204639.62198: stderr chunk (state=3): >>><<< 40074 1727204639.62201: stdout chunk (state=3): >>><<< 40074 1727204639.62217: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
40074 1727204639.62235: handler run complete
40074 1727204639.62268: attempt loop complete, returning result
40074 1727204639.62271: _execute() done
40074 1727204639.62274: dumping result to json
40074 1727204639.62282: done dumping result, returning
40074 1727204639.62292: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-9fd7-2501-00000000007c]
40074 1727204639.62297: sending task result for task 12b410aa-8751-9fd7-2501-00000000007c
40074 1727204639.62419: done sending task result for task 12b410aa-8751-9fd7-2501-00000000007c
40074 1727204639.62423: WORKER PROCESS EXITING
changed: [managed-node2] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "autoconnect": false,
                    "ip": {
                        "address": [
                            "198.51.100.4/24",
                            "2001:db8::6/32"
                        ],
                        "route": [
                            {
                                "gateway": "198.51.100.102",
                                "metric": 4,
                                "network": "198.58.10.64",
                                "prefix": 26
                            }
                        ]
                    },
                    "mac": "be:be:47:b2:eb:46",
                    "name": "ethtest1",
                    "type": "ethernet"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07
40074 1727204639.62601: no more pending results, returning what we have
40074 1727204639.62605: results queue empty
40074 1727204639.62606: checking for any_errors_fatal
40074 1727204639.62614: done checking for any_errors_fatal
40074 1727204639.62615: checking for max_fail_percentage
40074 1727204639.62617: done checking for max_fail_percentage
40074 1727204639.62618: checking to see if all hosts have failed and the running result is not ok
40074 1727204639.62619: done checking to see if all hosts have failed
40074 1727204639.62620: getting the remaining hosts for this loop
40074 1727204639.62621: done getting the remaining hosts for this loop
40074 1727204639.62625: getting the next task for host managed-node2
40074 1727204639.62632: done getting next task for host managed-node2
40074 1727204639.62636: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
40074 1727204639.62639: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204639.62651: getting variables
40074 1727204639.62653: in VariableManager get_vars()
40074 1727204639.62696: Calling all_inventory to load vars for managed-node2
40074 1727204639.62699: Calling groups_inventory to load vars for managed-node2
40074 1727204639.62710: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204639.62721: Calling all_plugins_play to load vars for managed-node2
40074 1727204639.62724: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204639.62728: Calling groups_plugins_play to load vars for managed-node2
40074 1727204639.63990: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204639.65723: done with get_vars()
40074 1727204639.65746: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking state] **********
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171
Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.733) 0:00:33.419 *****
40074 1727204639.65822: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state
40074 1727204639.66084: worker is 1 (out of 1 available)
40074 1727204639.66100: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state
40074 1727204639.66114: done queuing things up, now waiting for results queue to drain
40074 1727204639.66115: waiting for pending results...
40074 1727204639.66310: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state
40074 1727204639.66431: in run() - task 12b410aa-8751-9fd7-2501-00000000007d
40074 1727204639.66449: variable 'ansible_search_path' from source: unknown
40074 1727204639.66452: variable 'ansible_search_path' from source: unknown
40074 1727204639.66487: calling self._execute()
40074 1727204639.66583: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.66587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.66600: variable 'omit' from source: magic vars
40074 1727204639.66938: variable 'ansible_distribution_major_version' from source: facts
40074 1727204639.66949: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204639.67055: variable 'network_state' from source: role '' defaults
40074 1727204639.67064: Evaluated conditional (network_state != {}): False
40074 1727204639.67068: when evaluation is False, skipping this task
40074 1727204639.67070: _execute() done
40074 1727204639.67076: dumping result to json
40074 1727204639.67080: done dumping result, returning
40074 1727204639.67090: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-9fd7-2501-00000000007d]
40074 1727204639.67096: sending task result for task 12b410aa-8751-9fd7-2501-00000000007d
40074 1727204639.67196: done sending task result for task 12b410aa-8751-9fd7-2501-00000000007d
40074 1727204639.67199: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
40074 1727204639.67278: no more pending results, returning what we have
40074 1727204639.67281: results queue empty
40074 1727204639.67282: checking for any_errors_fatal
40074 1727204639.67294: done checking for any_errors_fatal
40074 1727204639.67295: checking for max_fail_percentage
40074 1727204639.67296: done checking for max_fail_percentage
40074 1727204639.67297: checking to see if all hosts have failed and the running result is not ok
40074 1727204639.67298: done checking to see if all hosts have failed
40074 1727204639.67299: getting the remaining hosts for this loop
40074 1727204639.67301: done getting the remaining hosts for this loop
40074 1727204639.67304: getting the next task for host managed-node2
40074 1727204639.67311: done getting next task for host managed-node2
40074 1727204639.67314: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
40074 1727204639.67320: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204639.67339: getting variables
40074 1727204639.67340: in VariableManager get_vars()
40074 1727204639.67378: Calling all_inventory to load vars for managed-node2
40074 1727204639.67381: Calling groups_inventory to load vars for managed-node2
40074 1727204639.67384: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204639.67402: Calling all_plugins_play to load vars for managed-node2
40074 1727204639.67404: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204639.67407: Calling groups_plugins_play to load vars for managed-node2
40074 1727204639.68601: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204639.70207: done with get_vars()
40074 1727204639.70232: done getting variables
40074 1727204639.70278: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177
Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.044) 0:00:33.464 *****
40074 1727204639.70307: entering _queue_task() for managed-node2/debug
40074 1727204639.70528: worker is 1 (out of 1 available)
40074 1727204639.70543: exiting _queue_task() for managed-node2/debug
40074 1727204639.70556: done queuing things up, now waiting for results queue to drain
40074 1727204639.70558: waiting for pending results...
40074 1727204639.70757: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections
40074 1727204639.70864: in run() - task 12b410aa-8751-9fd7-2501-00000000007e
40074 1727204639.70879: variable 'ansible_search_path' from source: unknown
40074 1727204639.70882: variable 'ansible_search_path' from source: unknown
40074 1727204639.70918: calling self._execute()
40074 1727204639.71008: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.71020: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.71027: variable 'omit' from source: magic vars
40074 1727204639.71361: variable 'ansible_distribution_major_version' from source: facts
40074 1727204639.71372: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204639.71378: variable 'omit' from source: magic vars
40074 1727204639.71432: variable 'omit' from source: magic vars
40074 1727204639.71465: variable 'omit' from source: magic vars
40074 1727204639.71503: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204639.71536: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204639.71556: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204639.71576: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204639.71588: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204639.71617: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204639.71624: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.71628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.71720: Set connection var ansible_pipelining to False
40074 1727204639.71728: Set connection var ansible_shell_executable to /bin/sh
40074 1727204639.71731: Set connection var ansible_shell_type to sh
40074 1727204639.71734: Set connection var ansible_connection to ssh
40074 1727204639.71741: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204639.71748: Set connection var ansible_timeout to 10
40074 1727204639.71774: variable 'ansible_shell_executable' from source: unknown
40074 1727204639.71779: variable 'ansible_connection' from source: unknown
40074 1727204639.71782: variable 'ansible_module_compression' from source: unknown
40074 1727204639.71784: variable 'ansible_shell_type' from source: unknown
40074 1727204639.71788: variable 'ansible_shell_executable' from source: unknown
40074 1727204639.71790: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.71805: variable 'ansible_pipelining' from source: unknown
40074 1727204639.71807: variable 'ansible_timeout' from source: unknown
40074 1727204639.71810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.71933: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204639.71944: variable 'omit' from source: magic vars
40074 1727204639.71950: starting attempt loop
40074 1727204639.71953: running the handler
40074 1727204639.72066: variable '__network_connections_result' from source: set_fact
40074 1727204639.72115: handler run complete
40074 1727204639.72138: attempt loop complete, returning result
40074 1727204639.72141: _execute() done
40074 1727204639.72144: dumping result to json
40074 1727204639.72149: done dumping result, returning
40074 1727204639.72158: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-9fd7-2501-00000000007e]
40074 1727204639.72163: sending task result for task 12b410aa-8751-9fd7-2501-00000000007e
40074 1727204639.72256: done sending task result for task 12b410aa-8751-9fd7-2501-00000000007e
40074 1727204639.72259: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "__network_connections_result.stderr_lines": [
        "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07"
    ]
}
40074 1727204639.72329: no more pending results, returning what we have
40074 1727204639.72332: results queue empty
40074 1727204639.72333: checking for any_errors_fatal
40074 1727204639.72339: done checking for any_errors_fatal
40074 1727204639.72340: checking for max_fail_percentage
40074 1727204639.72341: done checking for max_fail_percentage
40074 1727204639.72342: checking to see if all hosts have failed and the running result is not ok
40074 1727204639.72343: done checking to see if all hosts have failed
40074 1727204639.72344: getting the remaining hosts for this loop
40074 1727204639.72346: done getting the remaining hosts for this loop
40074 1727204639.72350: getting the next task for host managed-node2
40074 1727204639.72356: done getting next task for host managed-node2
40074 1727204639.72360: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
40074 1727204639.72363: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204639.72375: getting variables
40074 1727204639.72376: in VariableManager get_vars()
40074 1727204639.72415: Calling all_inventory to load vars for managed-node2
40074 1727204639.72419: Calling groups_inventory to load vars for managed-node2
40074 1727204639.72421: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204639.72430: Calling all_plugins_play to load vars for managed-node2
40074 1727204639.72434: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204639.72437: Calling groups_plugins_play to load vars for managed-node2
40074 1727204639.73806: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204639.75441: done with get_vars()
40074 1727204639.75466: done getting variables
40074 1727204639.75528: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181
Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.052) 0:00:33.517 *****
40074 1727204639.75557: entering _queue_task() for managed-node2/debug
40074 1727204639.75838: worker is 1 (out of 1 available)
40074 1727204639.75854: exiting _queue_task() for managed-node2/debug
40074 1727204639.75868: done queuing things up, now waiting for results queue to drain
40074 1727204639.75870: waiting for pending results...
40074 1727204639.76080: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections
40074 1727204639.76191: in run() - task 12b410aa-8751-9fd7-2501-00000000007f
40074 1727204639.76212: variable 'ansible_search_path' from source: unknown
40074 1727204639.76216: variable 'ansible_search_path' from source: unknown
40074 1727204639.76248: calling self._execute()
40074 1727204639.76338: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.76345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.76357: variable 'omit' from source: magic vars
40074 1727204639.76699: variable 'ansible_distribution_major_version' from source: facts
40074 1727204639.76710: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204639.76717: variable 'omit' from source: magic vars
40074 1727204639.76775: variable 'omit' from source: magic vars
40074 1727204639.76809: variable 'omit' from source: magic vars
40074 1727204639.76847: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204639.76883: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204639.76902: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204639.76919: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204639.76935: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204639.76965: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204639.76968: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.76971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.77061: Set connection var ansible_pipelining to False
40074 1727204639.77066: Set connection var ansible_shell_executable to /bin/sh
40074 1727204639.77070: Set connection var ansible_shell_type to sh
40074 1727204639.77073: Set connection var ansible_connection to ssh
40074 1727204639.77082: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204639.77092: Set connection var ansible_timeout to 10
40074 1727204639.77116: variable 'ansible_shell_executable' from source: unknown
40074 1727204639.77119: variable 'ansible_connection' from source: unknown
40074 1727204639.77125: variable 'ansible_module_compression' from source: unknown
40074 1727204639.77128: variable 'ansible_shell_type' from source: unknown
40074 1727204639.77133: variable 'ansible_shell_executable' from source: unknown
40074 1727204639.77135: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.77141: variable 'ansible_pipelining' from source: unknown
40074 1727204639.77144: variable 'ansible_timeout' from source: unknown
40074 1727204639.77150: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.77269: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204639.77280: variable 'omit' from source: magic vars
40074 1727204639.77286: starting attempt loop
40074 1727204639.77291: running the handler
40074 1727204639.77338: variable '__network_connections_result' from source: set_fact
40074 1727204639.77402: variable '__network_connections_result' from source: set_fact
40074 1727204639.77538: handler run complete
40074 1727204639.77558: attempt loop complete, returning result
40074 1727204639.77561: _execute() done
40074 1727204639.77564: dumping result to json
40074 1727204639.77571: done dumping result, returning
40074 1727204639.77579: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-9fd7-2501-00000000007f]
40074 1727204639.77584: sending task result for task 12b410aa-8751-9fd7-2501-00000000007f
40074 1727204639.77683: done sending task result for task 12b410aa-8751-9fd7-2501-00000000007f
40074 1727204639.77685: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "__network_connections_result": {
        "_invocation": {
            "module_args": {
                "__debug_flags": "",
                "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
                "connections": [
                    {
                        "autoconnect": false,
                        "ip": {
                            "address": [
                                "198.51.100.4/24",
                                "2001:db8::6/32"
                            ],
                            "route": [
                                {
                                    "gateway": "198.51.100.102",
                                    "metric": 4,
                                    "network": "198.58.10.64",
                                    "prefix": 26
                                }
                            ]
                        },
                        "mac": "be:be:47:b2:eb:46",
                        "name": "ethtest1",
                        "type": "ethernet"
                    }
                ],
                "force_state_change": false,
                "ignore_errors": false,
                "provider": "nm"
            }
        },
        "changed": true,
        "failed": false,
        "stderr": "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07\n",
        "stderr_lines": [
            "[002] #0, state:None persistent_state:present, 'ethtest1': update connection ethtest1, 5f0b5761-22ac-495b-91d9-5e5de304bc07"
        ]
    }
}
40074 1727204639.77803: no more pending results, returning what we have
40074 1727204639.77806: results queue empty
40074 1727204639.77808: checking for any_errors_fatal
40074 1727204639.77815: done checking for any_errors_fatal
40074 1727204639.77816: checking for max_fail_percentage
40074 1727204639.77818: done checking for max_fail_percentage
40074 1727204639.77819: checking to see if all hosts have failed and the running result is not ok
40074 1727204639.77820: done checking to see if all hosts have failed
40074 1727204639.77821: getting the remaining hosts for this loop
40074 1727204639.77823: done getting the remaining hosts for this loop
40074 1727204639.77826: getting the next task for host managed-node2
40074 1727204639.77832: done getting next task for host managed-node2
40074 1727204639.77835: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
40074 1727204639.77838: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204639.77849: getting variables
40074 1727204639.77851: in VariableManager get_vars()
40074 1727204639.77898: Calling all_inventory to load vars for managed-node2
40074 1727204639.77907: Calling groups_inventory to load vars for managed-node2
40074 1727204639.77910: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204639.77919: Calling all_plugins_play to load vars for managed-node2
40074 1727204639.77923: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204639.77926: Calling groups_plugins_play to load vars for managed-node2
40074 1727204639.79156: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204639.80902: done with get_vars()
40074 1727204639.80926: done getting variables
40074 1727204639.80975: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186
Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.054) 0:00:33.571 *****
40074 1727204639.81004: entering _queue_task() for managed-node2/debug
40074 1727204639.81248: worker is 1 (out of 1 available)
40074 1727204639.81265: exiting _queue_task() for managed-node2/debug
40074 1727204639.81280: done queuing things up, now waiting for results queue to drain
40074 1727204639.81281: waiting for pending results...
40074 1727204639.81473: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state
40074 1727204639.81581: in run() - task 12b410aa-8751-9fd7-2501-000000000080
40074 1727204639.81595: variable 'ansible_search_path' from source: unknown
40074 1727204639.81598: variable 'ansible_search_path' from source: unknown
40074 1727204639.81635: calling self._execute()
40074 1727204639.81724: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.81729: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.81742: variable 'omit' from source: magic vars
40074 1727204639.82063: variable 'ansible_distribution_major_version' from source: facts
40074 1727204639.82079: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204639.82180: variable 'network_state' from source: role '' defaults
40074 1727204639.82197: Evaluated conditional (network_state != {}): False
40074 1727204639.82200: when evaluation is False, skipping this task
40074 1727204639.82203: _execute() done
40074 1727204639.82206: dumping result to json
40074 1727204639.82208: done dumping result, returning
40074 1727204639.82215: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-9fd7-2501-000000000080]
40074 1727204639.82221: sending task result for task 12b410aa-8751-9fd7-2501-000000000080
40074 1727204639.82315: done sending task result for task 12b410aa-8751-9fd7-2501-000000000080
40074 1727204639.82321: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "false_condition": "network_state != {}"
}
40074 1727204639.82380: no more pending results, returning what we have
40074 1727204639.82383: results queue empty
40074 1727204639.82384: checking for any_errors_fatal
40074 1727204639.82394: done checking for any_errors_fatal
40074 1727204639.82395: checking for max_fail_percentage
40074 1727204639.82396: done checking for max_fail_percentage
40074 1727204639.82397: checking to see if all hosts have failed and the running result is not ok
40074 1727204639.82398: done checking to see if all hosts have failed
40074 1727204639.82399: getting the remaining hosts for this loop
40074 1727204639.82401: done getting the remaining hosts for this loop
40074 1727204639.82404: getting the next task for host managed-node2
40074 1727204639.82411: done getting next task for host managed-node2
40074 1727204639.82415: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity
40074 1727204639.82420: ^ state is: HOST STATE: block=3, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204639.82437: getting variables
40074 1727204639.82439: in VariableManager get_vars()
40074 1727204639.82486: Calling all_inventory to load vars for managed-node2
40074 1727204639.82493: Calling groups_inventory to load vars for managed-node2
40074 1727204639.82495: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204639.82504: Calling all_plugins_play to load vars for managed-node2
40074 1727204639.82506: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204639.82508: Calling groups_plugins_play to load vars for managed-node2
40074 1727204639.83727: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204639.85361: done with get_vars()
40074 1727204639.85388: done getting variables

TASK [fedora.linux_system_roles.network : Re-test connectivity] ****************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
Tuesday 24 September 2024 15:03:59 -0400 (0:00:00.044) 0:00:33.616 *****
40074 1727204639.85470: entering _queue_task() for managed-node2/ping
40074 1727204639.85728: worker is 1 (out of 1 available)
40074 1727204639.85744: exiting _queue_task() for managed-node2/ping
40074 1727204639.85758: done queuing things up, now waiting for results queue to drain
40074 1727204639.85759: waiting for pending results...
40074 1727204639.85954: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity
40074 1727204639.86054: in run() - task 12b410aa-8751-9fd7-2501-000000000081
40074 1727204639.86067: variable 'ansible_search_path' from source: unknown
40074 1727204639.86070: variable 'ansible_search_path' from source: unknown
40074 1727204639.86111: calling self._execute()
40074 1727204639.86192: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.86199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.86212: variable 'omit' from source: magic vars
40074 1727204639.86535: variable 'ansible_distribution_major_version' from source: facts
40074 1727204639.86554: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204639.86560: variable 'omit' from source: magic vars
40074 1727204639.86608: variable 'omit' from source: magic vars
40074 1727204639.86640: variable 'omit' from source: magic vars
40074 1727204639.86681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204639.86716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204639.86736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204639.86755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204639.86771: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204639.86799: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204639.86803: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.86807: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.86902: Set connection var ansible_pipelining to False
40074 1727204639.86908: Set connection var ansible_shell_executable to /bin/sh
40074 1727204639.86911: Set connection var ansible_shell_type to sh
40074 1727204639.86914: Set connection var ansible_connection to ssh
40074 1727204639.86924: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204639.86930: Set connection var ansible_timeout to 10
40074 1727204639.86953: variable 'ansible_shell_executable' from source: unknown
40074 1727204639.86957: variable 'ansible_connection' from source: unknown
40074 1727204639.86960: variable 'ansible_module_compression' from source: unknown
40074 1727204639.86963: variable 'ansible_shell_type' from source: unknown
40074 1727204639.86967: variable 'ansible_shell_executable' from source: unknown
40074 1727204639.86970: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204639.86978: variable 'ansible_pipelining' from source: unknown
40074 1727204639.86980: variable 'ansible_timeout' from source: unknown
40074 1727204639.86996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204639.87162: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action)
40074 1727204639.87174: variable 'omit' from source: magic vars
40074 1727204639.87180: starting attempt loop
40074 1727204639.87182: running the handler
40074 1727204639.87198: _low_level_execute_command(): starting
40074 1727204639.87206: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
40074 1727204639.87763: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204639.87767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204639.87770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204639.87773: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204639.87831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<<
40074 1727204639.87834: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
40074 1727204639.87837: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
40074 1727204639.87889: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
40074 1727204639.89657: stdout chunk (state=3): >>>/root <<<
40074 1727204639.89767: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
40074 1727204639.89822: stderr chunk (state=3): >>><<<
40074 1727204639.89826: stdout chunk (state=3): >>><<<
40074 1727204639.89849: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204639.89862: _low_level_execute_command(): starting 40074 1727204639.89869: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448 `" && echo ansible-tmp-1727204639.8984919-41468-253953587545448="` echo /root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448 `" ) && sleep 0' 40074 1727204639.90336: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204639.90339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.90342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config <<< 40074 1727204639.90352: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204639.90355: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.90395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204639.90422: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.90450: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.92522: stdout chunk (state=3): >>>ansible-tmp-1727204639.8984919-41468-253953587545448=/root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448 <<< 40074 1727204639.92627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204639.92680: stderr chunk (state=3): >>><<< 40074 1727204639.92684: stdout chunk (state=3): >>><<< 40074 1727204639.92704: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204639.8984919-41468-253953587545448=/root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204639.92749: variable 'ansible_module_compression' from source: unknown 40074 1727204639.92792: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 40074 1727204639.92831: variable 'ansible_facts' from source: unknown 40074 1727204639.92882: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/AnsiballZ_ping.py 40074 1727204639.92999: Sending initial data 40074 1727204639.93003: Sent initial data (153 bytes) 40074 1727204639.93485: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.93488: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.93493: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204639.93496: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.93553: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204639.93559: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.93600: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.95320: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 40074 1727204639.95324: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204639.95354: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204639.95393: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp5pby4jp1 /root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/AnsiballZ_ping.py <<< 40074 1727204639.95400: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/AnsiballZ_ping.py" <<< 40074 1727204639.95430: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp5pby4jp1" to remote "/root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/AnsiballZ_ping.py" <<< 40074 1727204639.95436: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/AnsiballZ_ping.py" <<< 40074 1727204639.96182: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204639.96251: stderr chunk (state=3): >>><<< 40074 1727204639.96255: stdout chunk (state=3): >>><<< 40074 1727204639.96277: done transferring module to remote 40074 1727204639.96292: _low_level_execute_command(): starting 40074 1727204639.96299: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/ /root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/AnsiballZ_ping.py && sleep 0' 40074 1727204639.96771: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204639.96775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204639.96777: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.96779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.96782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.96842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204639.96849: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.96885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204639.98808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204639.98856: stderr chunk (state=3): >>><<< 40074 1727204639.98859: stdout chunk (state=3): >>><<< 40074 1727204639.98871: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204639.98874: _low_level_execute_command(): starting 40074 1727204639.98884: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/AnsiballZ_ping.py && sleep 0' 40074 1727204639.99336: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204639.99339: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.99342: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204639.99344: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204639.99397: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204639.99403: stderr chunk (state=3): >>>debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204639.99463: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204640.16910: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 40074 1727204640.18363: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204640.18428: stderr chunk (state=3): >>><<< 40074 1727204640.18432: stdout chunk (state=3): >>><<< 40074 1727204640.18450: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204640.18475: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204640.18485: _low_level_execute_command(): starting 40074 1727204640.18494: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204639.8984919-41468-253953587545448/ > /dev/null 2>&1 && sleep 0' 40074 1727204640.18988: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204640.18993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204640.19004: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204640.19010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204640.19057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204640.19063: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204640.19112: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204640.21141: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204640.21202: stderr chunk (state=3): >>><<< 40074 1727204640.21206: stdout chunk (state=3): >>><<< 40074 1727204640.21223: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204640.21231: handler run complete 40074 1727204640.21248: attempt loop complete, returning result 
40074 1727204640.21251: _execute() done 40074 1727204640.21255: dumping result to json 40074 1727204640.21260: done dumping result, returning 40074 1727204640.21273: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-9fd7-2501-000000000081] 40074 1727204640.21279: sending task result for task 12b410aa-8751-9fd7-2501-000000000081 40074 1727204640.21378: done sending task result for task 12b410aa-8751-9fd7-2501-000000000081 40074 1727204640.21381: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 40074 1727204640.21463: no more pending results, returning what we have 40074 1727204640.21467: results queue empty 40074 1727204640.21468: checking for any_errors_fatal 40074 1727204640.21477: done checking for any_errors_fatal 40074 1727204640.21478: checking for max_fail_percentage 40074 1727204640.21480: done checking for max_fail_percentage 40074 1727204640.21481: checking to see if all hosts have failed and the running result is not ok 40074 1727204640.21483: done checking to see if all hosts have failed 40074 1727204640.21484: getting the remaining hosts for this loop 40074 1727204640.21485: done getting the remaining hosts for this loop 40074 1727204640.21497: getting the next task for host managed-node2 40074 1727204640.21513: done getting next task for host managed-node2 40074 1727204640.21515: ^ task is: TASK: meta (role_complete) 40074 1727204640.21521: ^ state is: HOST STATE: block=3, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 40074 1727204640.21534: getting variables 40074 1727204640.21536: in VariableManager get_vars() 40074 1727204640.21581: Calling all_inventory to load vars for managed-node2 40074 1727204640.21585: Calling groups_inventory to load vars for managed-node2 40074 1727204640.21587: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204640.21599: Calling all_plugins_play to load vars for managed-node2 40074 1727204640.21610: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204640.21615: Calling groups_plugins_play to load vars for managed-node2 40074 1727204640.23077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204640.24735: done with get_vars() 40074 1727204640.24769: done getting variables 40074 1727204640.24844: done queuing things up, now waiting for results queue to drain 40074 1727204640.24846: results queue empty 40074 1727204640.24847: checking for any_errors_fatal 40074 1727204640.24850: done checking for any_errors_fatal 40074 1727204640.24850: checking for max_fail_percentage 40074 1727204640.24851: done checking for max_fail_percentage 40074 1727204640.24852: checking to see if all hosts have failed and the running result is not ok 40074 1727204640.24853: done checking to see if all hosts have failed 40074 1727204640.24854: getting the remaining hosts for this loop 40074 1727204640.24855: done getting the remaining hosts for this loop 40074 1727204640.24857: getting the next task for host managed-node2 40074 1727204640.24861: done getting next task for host managed-node2 40074 1727204640.24863: ^ task is: TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 40074 1727204640.24865: ^ state is: HOST STATE: block=3, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204640.24868: getting variables 40074 1727204640.24869: in VariableManager get_vars() 40074 1727204640.24882: Calling all_inventory to load vars for managed-node2 40074 1727204640.24884: Calling groups_inventory to load vars for managed-node2 40074 1727204640.24885: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204640.24891: Calling all_plugins_play to load vars for managed-node2 40074 1727204640.24893: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204640.24896: Calling groups_plugins_play to load vars for managed-node2 40074 1727204640.26023: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204640.27743: done with get_vars() 40074 1727204640.27765: done getting variables 40074 1727204640.27811: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that the warning about specifying the route without the output device is logged for initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:122 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.423) 0:00:34.040 ***** 40074 1727204640.27835: entering _queue_task() for managed-node2/assert 40074 1727204640.28118: worker is 1 (out of 1 available) 40074 1727204640.28136: exiting _queue_task() for managed-node2/assert 40074 1727204640.28151: done queuing things up, now waiting for results queue to drain 40074 1727204640.28153: waiting for pending 
results... 40074 1727204640.28363: running TaskExecutor() for managed-node2/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider 40074 1727204640.28444: in run() - task 12b410aa-8751-9fd7-2501-0000000000b1 40074 1727204640.28456: variable 'ansible_search_path' from source: unknown 40074 1727204640.28492: calling self._execute() 40074 1727204640.28593: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204640.28603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204640.28616: variable 'omit' from source: magic vars 40074 1727204640.29006: variable 'ansible_distribution_major_version' from source: facts 40074 1727204640.29028: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204640.29144: variable 'network_provider' from source: set_fact 40074 1727204640.29149: Evaluated conditional (network_provider == "initscripts"): False 40074 1727204640.29154: when evaluation is False, skipping this task 40074 1727204640.29157: _execute() done 40074 1727204640.29160: dumping result to json 40074 1727204640.29174: done dumping result, returning 40074 1727204640.29178: done running TaskExecutor() for managed-node2/TASK: Assert that the warning about specifying the route without the output device is logged for initscripts provider [12b410aa-8751-9fd7-2501-0000000000b1] 40074 1727204640.29181: sending task result for task 12b410aa-8751-9fd7-2501-0000000000b1 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 40074 1727204640.29351: no more pending results, returning what we have 40074 1727204640.29355: results queue empty 40074 1727204640.29356: checking for any_errors_fatal 40074 1727204640.29358: done checking for any_errors_fatal 40074 1727204640.29359: checking for max_fail_percentage 40074 1727204640.29361: done 
checking for max_fail_percentage
40074 1727204640.29362: checking to see if all hosts have failed and the running result is not ok
40074 1727204640.29363: done checking to see if all hosts have failed
40074 1727204640.29364: getting the remaining hosts for this loop
40074 1727204640.29366: done getting the remaining hosts for this loop
40074 1727204640.29370: getting the next task for host managed-node2
40074 1727204640.29378: done getting next task for host managed-node2
40074 1727204640.29383: ^ task is: TASK: Assert that no warning is logged for nm provider
40074 1727204640.29386: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
40074 1727204640.29389: getting variables
40074 1727204640.29392: in VariableManager get_vars()
40074 1727204640.29438: Calling all_inventory to load vars for managed-node2
40074 1727204640.29441: Calling groups_inventory to load vars for managed-node2
40074 1727204640.29444: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204640.29455: Calling all_plugins_play to load vars for managed-node2
40074 1727204640.29458: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204640.29462: Calling groups_plugins_play to load vars for managed-node2
40074 1727204640.30514: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000b1
40074 1727204640.30777: WORKER PROCESS EXITING
40074 1727204640.30793: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204640.32405: done with get_vars()
40074 1727204640.32433: done getting variables
40074 1727204640.32484: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Assert that no warning is logged for nm provider] ************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:129
Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.046) 0:00:34.086 *****
40074 1727204640.32510: entering _queue_task() for managed-node2/assert
40074 1727204640.32779: worker is 1 (out of 1 available)
40074 1727204640.32794: exiting _queue_task() for managed-node2/assert
40074 1727204640.32809: done queuing things up, now waiting for results queue to drain
40074 1727204640.32811: waiting for pending results...
40074 1727204640.33014: running TaskExecutor() for managed-node2/TASK: Assert that no warning is logged for nm provider
40074 1727204640.33096: in run() - task 12b410aa-8751-9fd7-2501-0000000000b2
40074 1727204640.33111: variable 'ansible_search_path' from source: unknown
40074 1727204640.33146: calling self._execute()
40074 1727204640.33242: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204640.33249: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204640.33263: variable 'omit' from source: magic vars
40074 1727204640.33816: variable 'ansible_distribution_major_version' from source: facts
40074 1727204640.33842: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204640.34006: variable 'network_provider' from source: set_fact
40074 1727204640.34024: Evaluated conditional (network_provider == "nm"): True
40074 1727204640.34038: variable 'omit' from source: magic vars
40074 1727204640.34070: variable 'omit' from source: magic vars
40074 1727204640.34129: variable 'omit' from source: magic vars
40074 1727204640.34395: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204640.34399: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204640.34402: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204640.34404: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204640.34406: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204640.34409: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204640.34411: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204640.34413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204640.34540: Set connection var ansible_pipelining to False
40074 1727204640.34556: Set connection var ansible_shell_executable to /bin/sh
40074 1727204640.34566: Set connection var ansible_shell_type to sh
40074 1727204640.34575: Set connection var ansible_connection to ssh
40074 1727204640.34591: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204640.34606: Set connection var ansible_timeout to 10
40074 1727204640.34646: variable 'ansible_shell_executable' from source: unknown
40074 1727204640.34658: variable 'ansible_connection' from source: unknown
40074 1727204640.34668: variable 'ansible_module_compression' from source: unknown
40074 1727204640.34678: variable 'ansible_shell_type' from source: unknown
40074 1727204640.34687: variable 'ansible_shell_executable' from source: unknown
40074 1727204640.34699: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204640.34710: variable 'ansible_pipelining' from source: unknown
40074 1727204640.34721: variable 'ansible_timeout' from source: unknown
40074 1727204640.34733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204640.34916: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204640.34942: variable 'omit' from source: magic vars
40074 1727204640.34956: starting attempt loop
40074 1727204640.34966: running the handler
40074 1727204640.35186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204640.35485: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204640.35551: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204640.35651: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204640.35701: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204640.35829: variable '__network_connections_result' from source: set_fact
40074 1727204640.35864: Evaluated conditional (__network_connections_result.stderr is not search("")): True
40074 1727204640.35880: handler run complete
40074 1727204640.35910: attempt loop complete, returning result
40074 1727204640.35924: _execute() done
40074 1727204640.36095: dumping result to json
40074 1727204640.36098: done dumping result, returning
40074 1727204640.36101: done running TaskExecutor() for managed-node2/TASK: Assert that no warning is logged for nm provider [12b410aa-8751-9fd7-2501-0000000000b2]
40074 1727204640.36103: sending task result for task 12b410aa-8751-9fd7-2501-0000000000b2
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed
40074 1727204640.36249: no more pending results, returning what we have
40074 1727204640.36254: results queue empty
40074 1727204640.36255: checking for any_errors_fatal
40074 1727204640.36267: done checking for any_errors_fatal
40074 1727204640.36268: checking for max_fail_percentage
40074 1727204640.36270: done checking for max_fail_percentage
40074 1727204640.36272: checking to see if all hosts have failed and the running result is not ok
40074 1727204640.36273: done checking to see if all hosts have failed
40074 1727204640.36274: getting the remaining hosts for this loop
40074 1727204640.36276: done getting the remaining hosts for this loop
40074 1727204640.36280: getting the next task for host managed-node2
40074 1727204640.36295: done getting next task for host managed-node2
40074 1727204640.36300: ^ task is: TASK: Bring down test devices and profiles
40074 1727204640.36305: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204640.36311: getting variables
40074 1727204640.36313: in VariableManager get_vars()
40074 1727204640.36368: Calling all_inventory to load vars for managed-node2
40074 1727204640.36372: Calling groups_inventory to load vars for managed-node2
40074 1727204640.36375: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204640.36550: Calling all_plugins_play to load vars for managed-node2
40074 1727204640.36557: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204640.36563: Calling groups_plugins_play to load vars for managed-node2
40074 1727204640.37407: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000b2
40074 1727204640.37412: WORKER PROCESS EXITING
40074 1727204640.38913: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204640.40537: done with get_vars()
40074 1727204640.40565: done getting variables

TASK [Bring down test devices and profiles] ************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:140
Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.081) 0:00:34.168 *****
40074 1727204640.40658: entering _queue_task() for managed-node2/include_role
40074 1727204640.40659: Creating lock for include_role
40074 1727204640.40949: worker is 1 (out of 1 available)
40074 1727204640.40964: exiting _queue_task() for managed-node2/include_role
40074 1727204640.40978: done queuing things up, now waiting for results queue to drain
40074 1727204640.40979: waiting for pending results...
40074 1727204640.41175: running TaskExecutor() for managed-node2/TASK: Bring down test devices and profiles
40074 1727204640.41279: in run() - task 12b410aa-8751-9fd7-2501-0000000000b4
40074 1727204640.41294: variable 'ansible_search_path' from source: unknown
40074 1727204640.41331: calling self._execute()
40074 1727204640.41424: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204640.41428: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204640.41444: variable 'omit' from source: magic vars
40074 1727204640.41772: variable 'ansible_distribution_major_version' from source: facts
40074 1727204640.41780: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204640.41786: _execute() done
40074 1727204640.41794: dumping result to json
40074 1727204640.41799: done dumping result, returning
40074 1727204640.41804: done running TaskExecutor() for managed-node2/TASK: Bring down test devices and profiles [12b410aa-8751-9fd7-2501-0000000000b4]
40074 1727204640.41810: sending task result for task 12b410aa-8751-9fd7-2501-0000000000b4
40074 1727204640.41938: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000b4
40074 1727204640.41941: WORKER PROCESS EXITING
40074 1727204640.41977: no more pending results, returning what we have
40074 1727204640.41982: in VariableManager get_vars()
40074 1727204640.42033: Calling all_inventory to load vars for managed-node2
40074 1727204640.42037: Calling groups_inventory to load vars for managed-node2
40074 1727204640.42040: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204640.42054: Calling all_plugins_play to load vars for managed-node2
40074 1727204640.42058: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204640.42061: Calling groups_plugins_play to load vars for managed-node2
40074 1727204640.43318: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204640.45036: done with get_vars()
40074 1727204640.45058: variable 'ansible_search_path' from source: unknown
40074 1727204640.45266: variable 'omit' from source: magic vars
40074 1727204640.45296: variable 'omit' from source: magic vars
40074 1727204640.45309: variable 'omit' from source: magic vars
40074 1727204640.45314: we have included files to process
40074 1727204640.45315: generating all_blocks data
40074 1727204640.45317: done generating all_blocks data
40074 1727204640.45321: processing included file: fedora.linux_system_roles.network
40074 1727204640.45339: in VariableManager get_vars()
40074 1727204640.45354: done with get_vars()
40074 1727204640.45378: in VariableManager get_vars()
40074 1727204640.45394: done with get_vars()
40074 1727204640.45436: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
40074 1727204640.45533: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
40074 1727204640.45598: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
40074 1727204640.45974: in VariableManager get_vars()
40074 1727204640.45995: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
40074 1727204640.47651: iterating over new_blocks loaded from include file
40074 1727204640.47653: in VariableManager get_vars()
40074 1727204640.47671: done with get_vars()
40074 1727204640.47673: filtering new block on tags
40074 1727204640.47888: done filtering new block on tags
40074 1727204640.47894: in VariableManager get_vars()
40074 1727204640.47908: done with get_vars()
40074 1727204640.47909: filtering new block on tags
40074 1727204640.47925: done filtering new block on tags
40074 1727204640.47927: done iterating over new_blocks loaded from include file
included: fedora.linux_system_roles.network for managed-node2
40074 1727204640.47933: extending task lists for all hosts with included blocks
40074 1727204640.48106: done extending task lists
40074 1727204640.48107: done processing included files
40074 1727204640.48108: results queue empty
40074 1727204640.48108: checking for any_errors_fatal
40074 1727204640.48112: done checking for any_errors_fatal
40074 1727204640.48112: checking for max_fail_percentage
40074 1727204640.48113: done checking for max_fail_percentage
40074 1727204640.48114: checking to see if all hosts have failed and the running result is not ok
40074 1727204640.48115: done checking to see if all hosts have failed
40074 1727204640.48115: getting the remaining hosts for this loop
40074 1727204640.48116: done getting the remaining hosts for this loop
40074 1727204640.48120: getting the next task for host managed-node2
40074 1727204640.48123: done getting next task for host managed-node2
40074 1727204640.48125: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
40074 1727204640.48128: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204640.48136: getting variables
40074 1727204640.48137: in VariableManager get_vars()
40074 1727204640.48154: Calling all_inventory to load vars for managed-node2
40074 1727204640.48156: Calling groups_inventory to load vars for managed-node2
40074 1727204640.48159: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204640.48164: Calling all_plugins_play to load vars for managed-node2
40074 1727204640.48166: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204640.48168: Calling groups_plugins_play to load vars for managed-node2
40074 1727204640.49433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204640.51636: done with get_vars()
40074 1727204640.51664: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4
Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.110) 0:00:34.279 *****
40074 1727204640.51739: entering _queue_task() for managed-node2/include_tasks
40074 1727204640.52039: worker is 1 (out of 1 available)
40074 1727204640.52055: exiting _queue_task() for managed-node2/include_tasks
40074 1727204640.52068: done queuing things up, now waiting for results queue to drain
40074 1727204640.52070: waiting for pending results...
40074 1727204640.52268: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
40074 1727204640.52385: in run() - task 12b410aa-8751-9fd7-2501-000000000641
40074 1727204640.52401: variable 'ansible_search_path' from source: unknown
40074 1727204640.52405: variable 'ansible_search_path' from source: unknown
40074 1727204640.52440: calling self._execute()
40074 1727204640.52528: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204640.52532: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204640.52543: variable 'omit' from source: magic vars
40074 1727204640.52872: variable 'ansible_distribution_major_version' from source: facts
40074 1727204640.52884: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204640.52893: _execute() done
40074 1727204640.52898: dumping result to json
40074 1727204640.52902: done dumping result, returning
40074 1727204640.52910: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-9fd7-2501-000000000641]
40074 1727204640.52915: sending task result for task 12b410aa-8751-9fd7-2501-000000000641
40074 1727204640.53010: done sending task result for task 12b410aa-8751-9fd7-2501-000000000641
40074 1727204640.53013: WORKER PROCESS EXITING
40074 1727204640.53078: no more pending results, returning what we have
40074 1727204640.53083: in VariableManager get_vars()
40074 1727204640.53143: Calling all_inventory to load vars for managed-node2
40074 1727204640.53147: Calling groups_inventory to load vars for managed-node2
40074 1727204640.53150: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204640.53162: Calling all_plugins_play to load vars for managed-node2
40074 1727204640.53165: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204640.53168: Calling groups_plugins_play to load vars for managed-node2
40074 1727204640.55511: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204640.58782: done with get_vars()
40074 1727204640.58835: variable 'ansible_search_path' from source: unknown
40074 1727204640.58836: variable 'ansible_search_path' from source: unknown
40074 1727204640.58902: we have included files to process
40074 1727204640.58904: generating all_blocks data
40074 1727204640.58907: done generating all_blocks data
40074 1727204640.58911: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
40074 1727204640.58913: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
40074 1727204640.58915: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
40074 1727204640.59679: done processing included file
40074 1727204640.59681: iterating over new_blocks loaded from include file
40074 1727204640.59682: in VariableManager get_vars()
40074 1727204640.59707: done with get_vars()
40074 1727204640.59709: filtering new block on tags
40074 1727204640.59735: done filtering new block on tags
40074 1727204640.59738: in VariableManager get_vars()
40074 1727204640.59760: done with get_vars()
40074 1727204640.59762: filtering new block on tags
40074 1727204640.59797: done filtering new block on tags
40074 1727204640.59800: in VariableManager get_vars()
40074 1727204640.59821: done with get_vars()
40074 1727204640.59822: filtering new block on tags
40074 1727204640.59859: done filtering new block on tags
40074 1727204640.59862: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2
40074 1727204640.59866: extending task lists for all hosts with included blocks
40074 1727204640.60724: done extending task lists
40074 1727204640.60725: done processing included files
40074 1727204640.60726: results queue empty
40074 1727204640.60726: checking for any_errors_fatal
40074 1727204640.60730: done checking for any_errors_fatal
40074 1727204640.60730: checking for max_fail_percentage
40074 1727204640.60731: done checking for max_fail_percentage
40074 1727204640.60732: checking to see if all hosts have failed and the running result is not ok
40074 1727204640.60733: done checking to see if all hosts have failed
40074 1727204640.60733: getting the remaining hosts for this loop
40074 1727204640.60734: done getting the remaining hosts for this loop
40074 1727204640.60736: getting the next task for host managed-node2
40074 1727204640.60739: done getting next task for host managed-node2
40074 1727204640.60742: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
40074 1727204640.60744: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204640.60754: getting variables
40074 1727204640.60755: in VariableManager get_vars()
40074 1727204640.60770: Calling all_inventory to load vars for managed-node2
40074 1727204640.60772: Calling groups_inventory to load vars for managed-node2
40074 1727204640.60774: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204640.60778: Calling all_plugins_play to load vars for managed-node2
40074 1727204640.60781: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204640.60783: Calling groups_plugins_play to load vars for managed-node2
40074 1727204640.61961: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204640.64041: done with get_vars()
40074 1727204640.64068: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.123) 0:00:34.403 *****
40074 1727204640.64141: entering _queue_task() for managed-node2/setup
40074 1727204640.64435: worker is 1 (out of 1 available)
40074 1727204640.64451: exiting _queue_task() for managed-node2/setup
40074 1727204640.64464: done queuing things up, now waiting for results queue to drain
40074 1727204640.64465: waiting for pending results...
40074 1727204640.64671: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
40074 1727204640.64801: in run() - task 12b410aa-8751-9fd7-2501-0000000006a7
40074 1727204640.64823: variable 'ansible_search_path' from source: unknown
40074 1727204640.64827: variable 'ansible_search_path' from source: unknown
40074 1727204640.64855: calling self._execute()
40074 1727204640.64947: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204640.64954: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204640.64964: variable 'omit' from source: magic vars
40074 1727204640.65296: variable 'ansible_distribution_major_version' from source: facts
40074 1727204640.65308: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204640.65505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204640.67796: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204640.67862: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204640.67897: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204640.67928: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204640.67957: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204640.68028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204640.68056: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204640.68078: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204640.68113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204640.68126: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204640.68175: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204640.68198: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204640.68220: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204640.68250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204640.68263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204640.68401: variable '__network_required_facts' from source: role '' defaults
40074 1727204640.68409: variable 'ansible_facts' from source: unknown
40074 1727204640.69111: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
40074 1727204640.69116: when evaluation is False, skipping this task
40074 1727204640.69121: _execute() done
40074 1727204640.69125: dumping result to json
40074 1727204640.69127: done dumping result, returning
40074 1727204640.69135: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-9fd7-2501-0000000006a7]
40074 1727204640.69139: sending task result for task 12b410aa-8751-9fd7-2501-0000000006a7
40074 1727204640.69242: done sending task result for task 12b410aa-8751-9fd7-2501-0000000006a7
40074 1727204640.69245: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
40074 1727204640.69309: no more pending results, returning what we have
40074 1727204640.69314: results queue empty
40074 1727204640.69315: checking for any_errors_fatal
40074 1727204640.69318: done checking for any_errors_fatal
40074 1727204640.69319: checking for max_fail_percentage
40074 1727204640.69321: done checking for max_fail_percentage
40074 1727204640.69323: checking to see if all hosts have failed and the running result is not ok
40074 1727204640.69324: done checking to see if all hosts have failed
40074 1727204640.69325: getting the remaining hosts for this loop
40074 1727204640.69326: done getting the remaining hosts for this loop
40074 1727204640.69330: getting the next task for host managed-node2
40074 1727204640.69341: done getting next task for host managed-node2
40074 1727204640.69345: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
40074 1727204640.69351: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204640.69377: getting variables
40074 1727204640.69379: in VariableManager get_vars()
40074 1727204640.69428: Calling all_inventory to load vars for managed-node2
40074 1727204640.69432: Calling groups_inventory to load vars for managed-node2
40074 1727204640.69434: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204640.69445: Calling all_plugins_play to load vars for managed-node2
40074 1727204640.69448: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204640.69452: Calling groups_plugins_play to load vars for managed-node2
40074 1727204640.71534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204640.73374: done with get_vars()
40074 1727204640.73401: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.093) 0:00:34.496 *****
40074 1727204640.73496: entering _queue_task() for managed-node2/stat
40074 1727204640.73769: worker is 1 (out of 1 available)
40074 1727204640.73783: exiting _queue_task() for managed-node2/stat
40074 1727204640.73798: done queuing things up, now waiting for results queue to drain
40074 1727204640.73800: waiting for pending results...
40074 1727204640.73999: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree
40074 1727204640.74124: in run() - task 12b410aa-8751-9fd7-2501-0000000006a9
40074 1727204640.74139: variable 'ansible_search_path' from source: unknown
40074 1727204640.74145: variable 'ansible_search_path' from source: unknown
40074 1727204640.74174: calling self._execute()
40074 1727204640.74262: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204640.74273: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204640.74283: variable 'omit' from source: magic vars
40074 1727204640.74620: variable 'ansible_distribution_major_version' from source: facts
40074 1727204640.74633: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204640.74776: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204640.75294: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204640.75298: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204640.75300: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204640.75302: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204640.75338: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204640.75373: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204640.75411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204640.75453: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204640.75565: variable '__network_is_ostree' from source: set_fact
40074 1727204640.75579: Evaluated conditional (not __network_is_ostree is defined): False
40074 1727204640.75588: when evaluation is False, skipping this task
40074 1727204640.75599: _execute() done
40074 1727204640.75608: dumping result to json
40074 1727204640.75616: done dumping result, returning
40074 1727204640.75633: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-9fd7-2501-0000000006a9]
40074 1727204640.75644: sending task result for task 12b410aa-8751-9fd7-2501-0000000006a9
40074 1727204640.75767: done sending task result for task 12b410aa-8751-9fd7-2501-0000000006a9
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
40074 1727204640.75835: no more pending results, returning what we have
40074 1727204640.75839: results queue empty
40074 1727204640.75840: checking for any_errors_fatal
40074 1727204640.75851: done checking for any_errors_fatal
40074 1727204640.75852: checking for max_fail_percentage
40074 1727204640.75854: done checking for max_fail_percentage
40074 1727204640.75855: checking to see if all hosts have failed and the running result is not ok
40074 1727204640.75856: done checking to see if all hosts have failed
40074 1727204640.75857: getting the remaining hosts for this loop
40074 1727204640.75859: done getting the remaining hosts for this loop
40074 1727204640.75863: getting the next task for host managed-node2
40074 1727204640.75871: done getting next task for host managed-node2
40074 1727204640.75874: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
40074 1727204640.75908: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task?
False 40074 1727204640.75938: getting variables 40074 1727204640.75940: in VariableManager get_vars() 40074 1727204640.76059: Calling all_inventory to load vars for managed-node2 40074 1727204640.76062: Calling groups_inventory to load vars for managed-node2 40074 1727204640.76065: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204640.76077: Calling all_plugins_play to load vars for managed-node2 40074 1727204640.76080: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204640.76084: Calling groups_plugins_play to load vars for managed-node2 40074 1727204640.76605: WORKER PROCESS EXITING 40074 1727204640.77418: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204640.79075: done with get_vars() 40074 1727204640.79111: done getting variables 40074 1727204640.79175: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.057) 0:00:34.553 ***** 40074 1727204640.79210: entering _queue_task() for managed-node2/set_fact 40074 1727204640.79509: worker is 1 (out of 1 available) 40074 1727204640.79527: exiting _queue_task() for managed-node2/set_fact 40074 1727204640.79544: done queuing things up, now waiting for results queue to drain 40074 1727204640.79545: waiting for pending results... 
40074 1727204640.79756: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 40074 1727204640.79887: in run() - task 12b410aa-8751-9fd7-2501-0000000006aa 40074 1727204640.79909: variable 'ansible_search_path' from source: unknown 40074 1727204640.79914: variable 'ansible_search_path' from source: unknown 40074 1727204640.79947: calling self._execute() 40074 1727204640.80040: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204640.80048: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204640.80057: variable 'omit' from source: magic vars 40074 1727204640.80391: variable 'ansible_distribution_major_version' from source: facts 40074 1727204640.80403: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204640.80557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204640.80796: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204640.80836: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204640.80868: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204640.80901: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204640.80974: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204640.81003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204640.81028: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204640.81049: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204640.81132: variable '__network_is_ostree' from source: set_fact 40074 1727204640.81140: Evaluated conditional (not __network_is_ostree is defined): False 40074 1727204640.81143: when evaluation is False, skipping this task 40074 1727204640.81146: _execute() done 40074 1727204640.81153: dumping result to json 40074 1727204640.81157: done dumping result, returning 40074 1727204640.81166: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-9fd7-2501-0000000006aa] 40074 1727204640.81171: sending task result for task 12b410aa-8751-9fd7-2501-0000000006aa 40074 1727204640.81285: done sending task result for task 12b410aa-8751-9fd7-2501-0000000006aa 40074 1727204640.81288: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 40074 1727204640.81365: no more pending results, returning what we have 40074 1727204640.81369: results queue empty 40074 1727204640.81370: checking for any_errors_fatal 40074 1727204640.81376: done checking for any_errors_fatal 40074 1727204640.81377: checking for max_fail_percentage 40074 1727204640.81379: done checking for max_fail_percentage 40074 1727204640.81380: checking to see if all hosts have failed and the running result is not ok 40074 1727204640.81381: done checking to see if all hosts have failed 40074 1727204640.81382: getting the remaining hosts for this loop 40074 1727204640.81384: done getting the remaining hosts for this loop 
40074 1727204640.81388: getting the next task for host managed-node2 40074 1727204640.81407: done getting next task for host managed-node2 40074 1727204640.81412: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 40074 1727204640.81421: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204640.81441: getting variables 40074 1727204640.81443: in VariableManager get_vars() 40074 1727204640.81486: Calling all_inventory to load vars for managed-node2 40074 1727204640.81494: Calling groups_inventory to load vars for managed-node2 40074 1727204640.81497: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204640.81508: Calling all_plugins_play to load vars for managed-node2 40074 1727204640.81513: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204640.81516: Calling groups_plugins_play to load vars for managed-node2 40074 1727204640.86912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204640.88538: done with get_vars() 40074 1727204640.88565: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 15:04:00 -0400 (0:00:00.094) 0:00:34.648 ***** 40074 1727204640.88647: entering _queue_task() for managed-node2/service_facts 40074 1727204640.88946: worker is 1 (out of 1 available) 40074 1727204640.88961: exiting _queue_task() for managed-node2/service_facts 40074 1727204640.88975: done queuing things up, now waiting for results queue to drain 40074 1727204640.88978: waiting for pending results... 
40074 1727204640.89180: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 40074 1727204640.89325: in run() - task 12b410aa-8751-9fd7-2501-0000000006ac 40074 1727204640.89337: variable 'ansible_search_path' from source: unknown 40074 1727204640.89341: variable 'ansible_search_path' from source: unknown 40074 1727204640.89373: calling self._execute() 40074 1727204640.89464: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204640.89471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204640.89481: variable 'omit' from source: magic vars 40074 1727204640.89820: variable 'ansible_distribution_major_version' from source: facts 40074 1727204640.89829: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204640.89836: variable 'omit' from source: magic vars 40074 1727204640.89901: variable 'omit' from source: magic vars 40074 1727204640.89929: variable 'omit' from source: magic vars 40074 1727204640.89968: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204640.90006: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204640.90024: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204640.90041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204640.90053: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204640.90081: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204640.90085: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204640.90088: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 40074 1727204640.90177: Set connection var ansible_pipelining to False 40074 1727204640.90183: Set connection var ansible_shell_executable to /bin/sh 40074 1727204640.90186: Set connection var ansible_shell_type to sh 40074 1727204640.90190: Set connection var ansible_connection to ssh 40074 1727204640.90198: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204640.90214: Set connection var ansible_timeout to 10 40074 1727204640.90233: variable 'ansible_shell_executable' from source: unknown 40074 1727204640.90236: variable 'ansible_connection' from source: unknown 40074 1727204640.90239: variable 'ansible_module_compression' from source: unknown 40074 1727204640.90243: variable 'ansible_shell_type' from source: unknown 40074 1727204640.90246: variable 'ansible_shell_executable' from source: unknown 40074 1727204640.90251: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204640.90256: variable 'ansible_pipelining' from source: unknown 40074 1727204640.90259: variable 'ansible_timeout' from source: unknown 40074 1727204640.90265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204640.90441: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204640.90450: variable 'omit' from source: magic vars 40074 1727204640.90456: starting attempt loop 40074 1727204640.90460: running the handler 40074 1727204640.90474: _low_level_execute_command(): starting 40074 1727204640.90481: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204640.91045: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 40074 1727204640.91049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204640.91052: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204640.91056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204640.91114: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204640.91120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204640.91167: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204640.92959: stdout chunk (state=3): >>>/root <<< 40074 1727204640.93069: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204640.93129: stderr chunk (state=3): >>><<< 40074 1727204640.93132: stdout chunk (state=3): >>><<< 40074 1727204640.93162: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204640.93173: _low_level_execute_command(): starting 40074 1727204640.93179: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319 `" && echo ansible-tmp-1727204640.931602-41498-236312438164319="` echo /root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319 `" ) && sleep 0' 40074 1727204640.93664: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204640.93669: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204640.93673: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 
1727204640.93684: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204640.93741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204640.93748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204640.93750: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204640.93787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204640.95854: stdout chunk (state=3): >>>ansible-tmp-1727204640.931602-41498-236312438164319=/root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319 <<< 40074 1727204640.95969: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204640.96029: stderr chunk (state=3): >>><<< 40074 1727204640.96033: stdout chunk (state=3): >>><<< 40074 1727204640.96053: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204640.931602-41498-236312438164319=/root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204640.96102: variable 'ansible_module_compression' from source: unknown 40074 1727204640.96143: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 40074 1727204640.96185: variable 'ansible_facts' from source: unknown 40074 1727204640.96245: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/AnsiballZ_service_facts.py 40074 1727204640.96374: Sending initial data 40074 1727204640.96378: Sent initial data (161 bytes) 40074 1727204640.96874: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204640.96877: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204640.96880: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204640.96883: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204640.96948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204640.96952: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204640.96955: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204640.96988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204640.98682: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 40074 1727204640.98693: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204640.98718: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204640.98756: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpju21szvm /root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/AnsiballZ_service_facts.py <<< 40074 1727204640.98760: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/AnsiballZ_service_facts.py" <<< 40074 1727204640.98793: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpju21szvm" to remote "/root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/AnsiballZ_service_facts.py" <<< 40074 1727204640.99607: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204640.99662: stderr chunk (state=3): >>><<< 40074 1727204640.99667: stdout chunk (state=3): >>><<< 40074 1727204640.99690: done transferring module to remote 40074 1727204640.99701: _low_level_execute_command(): starting 40074 1727204640.99707: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/ /root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/AnsiballZ_service_facts.py && sleep 0' 40074 1727204641.00149: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204641.00183: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204641.00186: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204641.00197: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204641.00202: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204641.00204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204641.00261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204641.00264: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204641.00302: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204641.02258: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204641.02313: stderr chunk (state=3): >>><<< 40074 1727204641.02316: stdout chunk (state=3): >>><<< 40074 1727204641.02333: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204641.02336: _low_level_execute_command(): starting 40074 1727204641.02342: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/AnsiballZ_service_facts.py && sleep 0' 40074 1727204641.02794: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204641.02840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204641.02843: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204641.02846: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204641.02848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204641.02850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204641.02898: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204641.02901: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204641.02957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204643.12274: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": 
"dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": 
{"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"n<<< 40074 1727204643.12293: stdout chunk (state=3): >>>ame": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": 
{"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "seria<<< 40074 1727204643.12312: stdout chunk (state=3): >>>l-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": 
{"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service"<<< 40074 1727204643.12320: stdout chunk (state=3): >>>, "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": 
{"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": 
{"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name<<< 40074 1727204643.12338: stdout chunk (state=3): >>>": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": 
"inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": 
"alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inact<<< 40074 1727204643.12360: stdout chunk (state=3): >>>ive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": 
"man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": 
{"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": 
"sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 40074 1727204643.13983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204643.14043: stderr chunk (state=3): >>><<< 40074 1727204643.14047: stdout chunk (state=3): >>><<< 40074 1727204643.14084: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": 
"static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": 
"systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": 
"cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": 
"inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": 
{"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "nmstate.service": {"name": "nmstate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": 
"plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.9.159 closed. 40074 1727204643.14764: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204643.14775: _low_level_execute_command(): starting 40074 1727204643.14781: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204640.931602-41498-236312438164319/ > /dev/null 2>&1 && sleep 0' 40074 1727204643.15253: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204643.15291: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204643.15295: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204643.15297: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204643.15302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204643.15305: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204643.15359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204643.15366: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204643.15369: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204643.15412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204643.17412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204643.17471: stderr chunk (state=3): >>><<< 40074 1727204643.17474: stdout chunk (state=3): >>><<< 40074 1727204643.17491: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 
debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204643.17499: handler run complete 40074 1727204643.17673: variable 'ansible_facts' from source: unknown 40074 1727204643.18010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204643.18669: variable 'ansible_facts' from source: unknown 40074 1727204643.18898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204643.19104: attempt loop complete, returning result 40074 1727204643.19111: _execute() done 40074 1727204643.19114: dumping result to json 40074 1727204643.19166: done dumping result, returning 40074 1727204643.19176: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-9fd7-2501-0000000006ac] 40074 1727204643.19181: sending task result for task 12b410aa-8751-9fd7-2501-0000000006ac ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204643.20141: no more pending results, returning what we have 40074 1727204643.20145: results queue empty 40074 1727204643.20146: checking for any_errors_fatal 40074 1727204643.20150: done checking for any_errors_fatal 40074 1727204643.20151: checking for max_fail_percentage 40074 1727204643.20153: done checking for max_fail_percentage 40074 1727204643.20154: checking to see if all hosts have failed and the running result is not ok 40074 1727204643.20155: done checking to see if all hosts have failed 40074 1727204643.20156: getting the remaining hosts for this loop 40074 1727204643.20157: done getting the remaining hosts for this loop 40074 1727204643.20160: getting the next task for host managed-node2 40074 1727204643.20166: done getting next task for host managed-node2 40074 1727204643.20168: ^ 
task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 40074 1727204643.20173: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204643.20183: done sending task result for task 12b410aa-8751-9fd7-2501-0000000006ac 40074 1727204643.20186: WORKER PROCESS EXITING 40074 1727204643.20195: getting variables 40074 1727204643.20196: in VariableManager get_vars() 40074 1727204643.20232: Calling all_inventory to load vars for managed-node2 40074 1727204643.20235: Calling groups_inventory to load vars for managed-node2 40074 1727204643.20236: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204643.20244: Calling all_plugins_play to load vars for managed-node2 40074 1727204643.20247: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204643.20249: Calling groups_plugins_play to load vars for managed-node2 40074 1727204643.22045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204643.25178: done with get_vars() 40074 1727204643.25237: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 15:04:03 -0400 (0:00:02.367) 0:00:37.015 ***** 40074 1727204643.25357: entering _queue_task() for managed-node2/package_facts 40074 1727204643.25765: worker is 1 (out of 1 available) 40074 1727204643.25779: exiting _queue_task() for managed-node2/package_facts 40074 1727204643.26006: done queuing things up, now waiting for results queue to drain 40074 1727204643.26009: waiting for pending results... 
40074 1727204643.26210: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 40074 1727204643.26376: in run() - task 12b410aa-8751-9fd7-2501-0000000006ad 40074 1727204643.26404: variable 'ansible_search_path' from source: unknown 40074 1727204643.26413: variable 'ansible_search_path' from source: unknown 40074 1727204643.26469: calling self._execute() 40074 1727204643.26591: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204643.26605: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204643.26623: variable 'omit' from source: magic vars 40074 1727204643.27195: variable 'ansible_distribution_major_version' from source: facts 40074 1727204643.27198: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204643.27201: variable 'omit' from source: magic vars 40074 1727204643.27277: variable 'omit' from source: magic vars 40074 1727204643.27336: variable 'omit' from source: magic vars 40074 1727204643.27397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204643.27457: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204643.27487: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204643.27523: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204643.27549: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204643.27654: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204643.27658: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204643.27661: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed-node2' 40074 1727204643.27766: Set connection var ansible_pipelining to False 40074 1727204643.27781: Set connection var ansible_shell_executable to /bin/sh 40074 1727204643.27791: Set connection var ansible_shell_type to sh 40074 1727204643.27801: Set connection var ansible_connection to ssh 40074 1727204643.27816: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204643.27832: Set connection var ansible_timeout to 10 40074 1727204643.27874: variable 'ansible_shell_executable' from source: unknown 40074 1727204643.27884: variable 'ansible_connection' from source: unknown 40074 1727204643.27896: variable 'ansible_module_compression' from source: unknown 40074 1727204643.27980: variable 'ansible_shell_type' from source: unknown 40074 1727204643.27984: variable 'ansible_shell_executable' from source: unknown 40074 1727204643.27987: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204643.27991: variable 'ansible_pipelining' from source: unknown 40074 1727204643.27995: variable 'ansible_timeout' from source: unknown 40074 1727204643.27997: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204643.28203: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204643.28227: variable 'omit' from source: magic vars 40074 1727204643.28240: starting attempt loop 40074 1727204643.28248: running the handler 40074 1727204643.28270: _low_level_execute_command(): starting 40074 1727204643.28283: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204643.29088: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204643.29105: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 40074 1727204643.29228: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204643.29286: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204643.29335: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204643.31429: stdout chunk (state=3): >>>/root <<< 40074 1727204643.31446: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204643.31476: stdout chunk (state=3): >>><<< 40074 1727204643.31479: stderr chunk (state=3): >>><<< 40074 1727204643.31503: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204643.31629: _low_level_execute_command(): starting 40074 1727204643.31633: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799 `" && echo ansible-tmp-1727204643.31511-41795-197259959900799="` echo /root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799 `" ) && sleep 0' 40074 1727204643.32205: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204643.32224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204643.32243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204643.32266: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204643.32285: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204643.32394: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204643.32427: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204643.32613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204643.34559: stdout chunk (state=3): >>>ansible-tmp-1727204643.31511-41795-197259959900799=/root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799 <<< 40074 1727204643.34804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204643.34827: stderr chunk (state=3): >>><<< 40074 1727204643.34838: stdout chunk (state=3): >>><<< 40074 1727204643.34867: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204643.31511-41795-197259959900799=/root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204643.34949: variable 'ansible_module_compression' from source: unknown 40074 1727204643.35024: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 40074 1727204643.35102: variable 'ansible_facts' from source: unknown 40074 1727204643.35343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/AnsiballZ_package_facts.py 40074 1727204643.35596: Sending initial data 40074 1727204643.35607: Sent initial data (160 bytes) 40074 1727204643.36351: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204643.36436: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 
1727204643.36465: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204643.36583: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204643.38280: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204643.38432: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204643.38506: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpsdulkc9u /root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/AnsiballZ_package_facts.py <<< 40074 1727204643.38521: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/AnsiballZ_package_facts.py" <<< 40074 1727204643.38526: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpsdulkc9u" to remote "/root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/AnsiballZ_package_facts.py" <<< 40074 1727204643.41171: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204643.41175: stderr chunk (state=3): >>><<< 40074 1727204643.41178: stdout chunk (state=3): >>><<< 40074 1727204643.41180: done transferring module to remote 40074 1727204643.41183: _low_level_execute_command(): starting 40074 1727204643.41186: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/ /root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/AnsiballZ_package_facts.py && sleep 0' 40074 1727204643.41839: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204643.41844: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204643.41847: stderr chunk 
(state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204643.41850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204643.41902: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204643.41937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204643.44011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204643.44050: stdout chunk (state=3): >>><<< 40074 1727204643.44053: stderr chunk (state=3): >>><<< 40074 1727204643.44071: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204643.44178: _low_level_execute_command(): starting 40074 1727204643.44182: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/AnsiballZ_package_facts.py && sleep 0' 40074 1727204643.44812: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204643.44838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204643.44854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204643.44878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204643.44958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204643.45000: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204643.45015: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204643.45039: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 40074 1727204643.45251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204644.09875: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", 
"release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 40074 1727204644.10067: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": 
[{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": 
[{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": 
"2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": 
"1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": 
"cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": 
"langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": 
[{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": 
"2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", 
"version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64",
"source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": 
"libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": 
"0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", 
"epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": 
"4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": 
"perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", 
"release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", <<< 40074 1727204644.10133: stdout chunk (state=3): >>>"source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": 
"perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": 
"0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": 
"1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": 
[{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": 
"1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", 
"version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 40074 1727204644.12093: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204644.12120: stderr chunk (state=3): >>><<< 40074 1727204644.12131: stdout chunk (state=3): >>><<< 40074 1727204644.12188: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": 
"fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", 
"release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": 
[{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", 
"release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": 
"5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", 
"version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", 
"version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": 
[{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", 
"version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", 
"version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": 
"x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", 
"release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": 
"rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": 
"perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", 
"version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": 
[{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", 
"version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": 
"6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": 
[{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": 
[{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", 
"version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.10", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "nmstate": [{"name": "nmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-config-server": [{"name": "NetworkManager-config-server", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "nmstate-libs": [{"name": "nmstate-libs", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libnmstate": [{"name": "python3-libnmstate", "version": "2.2.34", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 Shared connection to 10.31.9.159 closed. 40074 1727204644.16926: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204644.16963: _low_level_execute_command(): starting 40074 1727204644.16974: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204643.31511-41795-197259959900799/ > /dev/null 2>&1 && sleep 0' 40074 1727204644.17678: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204644.17795: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204644.17862: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204644.17897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204644.20045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204644.20050: stdout chunk (state=3): >>><<< 40074 1727204644.20052: stderr chunk (state=3): >>><<< 40074 1727204644.20079: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204644.20098: handler run complete 40074 1727204644.21898: variable 'ansible_facts' from source: unknown 40074 1727204644.22830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 40074 1727204644.26086: variable 'ansible_facts' from source: unknown 40074 1727204644.26534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.27739: attempt loop complete, returning result 40074 1727204644.27743: _execute() done 40074 1727204644.27746: dumping result to json 40074 1727204644.28048: done dumping result, returning 40074 1727204644.28063: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-9fd7-2501-0000000006ad] 40074 1727204644.28067: sending task result for task 12b410aa-8751-9fd7-2501-0000000006ad 40074 1727204644.30782: done sending task result for task 12b410aa-8751-9fd7-2501-0000000006ad 40074 1727204644.30786: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204644.30896: no more pending results, returning what we have 40074 1727204644.30899: results queue empty 40074 1727204644.30899: checking for any_errors_fatal 40074 1727204644.30904: done checking for any_errors_fatal 40074 1727204644.30904: checking for max_fail_percentage 40074 1727204644.30905: done checking for max_fail_percentage 40074 1727204644.30906: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.30907: done checking to see if all hosts have failed 40074 1727204644.30908: getting the remaining hosts for this loop 40074 1727204644.30908: done getting the remaining hosts for this loop 40074 1727204644.30911: getting the next task for host managed-node2 40074 1727204644.30919: done getting next task for host managed-node2 40074 1727204644.30922: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 40074 1727204644.30925: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, 
run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204644.30934: getting variables 40074 1727204644.30935: in VariableManager get_vars() 40074 1727204644.30964: Calling all_inventory to load vars for managed-node2 40074 1727204644.30967: Calling groups_inventory to load vars for managed-node2 40074 1727204644.30968: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.30976: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.30978: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.30980: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.32651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.34278: done with get_vars() 40074 1727204644.34304: done getting variables 40074 1727204644.34357: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] 
************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 15:04:04 -0400 (0:00:01.090) 0:00:38.105 ***** 40074 1727204644.34391: entering _queue_task() for managed-node2/debug 40074 1727204644.34653: worker is 1 (out of 1 available) 40074 1727204644.34669: exiting _queue_task() for managed-node2/debug 40074 1727204644.34683: done queuing things up, now waiting for results queue to drain 40074 1727204644.34685: waiting for pending results... 40074 1727204644.34886: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 40074 1727204644.34996: in run() - task 12b410aa-8751-9fd7-2501-000000000642 40074 1727204644.35011: variable 'ansible_search_path' from source: unknown 40074 1727204644.35015: variable 'ansible_search_path' from source: unknown 40074 1727204644.35055: calling self._execute() 40074 1727204644.35150: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.35158: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.35169: variable 'omit' from source: magic vars 40074 1727204644.35502: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.35513: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204644.35523: variable 'omit' from source: magic vars 40074 1727204644.35576: variable 'omit' from source: magic vars 40074 1727204644.35655: variable 'network_provider' from source: set_fact 40074 1727204644.35671: variable 'omit' from source: magic vars 40074 1727204644.35711: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204644.35745: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204644.35764: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204644.35781: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204644.35798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204644.35829: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204644.35832: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.35835: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.35928: Set connection var ansible_pipelining to False 40074 1727204644.35935: Set connection var ansible_shell_executable to /bin/sh 40074 1727204644.35938: Set connection var ansible_shell_type to sh 40074 1727204644.35941: Set connection var ansible_connection to ssh 40074 1727204644.35949: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204644.35956: Set connection var ansible_timeout to 10 40074 1727204644.35979: variable 'ansible_shell_executable' from source: unknown 40074 1727204644.35983: variable 'ansible_connection' from source: unknown 40074 1727204644.35985: variable 'ansible_module_compression' from source: unknown 40074 1727204644.35988: variable 'ansible_shell_type' from source: unknown 40074 1727204644.35993: variable 'ansible_shell_executable' from source: unknown 40074 1727204644.35997: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.36008: variable 'ansible_pipelining' from source: unknown 40074 1727204644.36012: variable 'ansible_timeout' from source: unknown 40074 1727204644.36015: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.36135: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204644.36146: variable 'omit' from source: magic vars 40074 1727204644.36152: starting attempt loop 40074 1727204644.36156: running the handler 40074 1727204644.36198: handler run complete 40074 1727204644.36211: attempt loop complete, returning result 40074 1727204644.36215: _execute() done 40074 1727204644.36223: dumping result to json 40074 1727204644.36226: done dumping result, returning 40074 1727204644.36236: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-9fd7-2501-000000000642] 40074 1727204644.36238: sending task result for task 12b410aa-8751-9fd7-2501-000000000642 40074 1727204644.36328: done sending task result for task 12b410aa-8751-9fd7-2501-000000000642 40074 1727204644.36331: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 40074 1727204644.36409: no more pending results, returning what we have 40074 1727204644.36413: results queue empty 40074 1727204644.36414: checking for any_errors_fatal 40074 1727204644.36426: done checking for any_errors_fatal 40074 1727204644.36427: checking for max_fail_percentage 40074 1727204644.36428: done checking for max_fail_percentage 40074 1727204644.36430: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.36431: done checking to see if all hosts have failed 40074 1727204644.36432: getting the remaining hosts for this loop 40074 1727204644.36433: done getting the remaining hosts for this loop 40074 1727204644.36438: getting the next task for host managed-node2 40074 1727204644.36447: done getting next task for host managed-node2 40074 1727204644.36451: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` 
variable with the initscripts provider 40074 1727204644.36455: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204644.36468: getting variables 40074 1727204644.36469: in VariableManager get_vars() 40074 1727204644.36516: Calling all_inventory to load vars for managed-node2 40074 1727204644.36519: Calling groups_inventory to load vars for managed-node2 40074 1727204644.36522: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.36532: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.36535: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.36538: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.37837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.39440: done with get_vars() 40074 1727204644.39461: done getting variables 40074 1727204644.39512: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.051) 0:00:38.157 ***** 40074 1727204644.39541: entering _queue_task() for managed-node2/fail 40074 1727204644.39781: worker is 1 (out of 1 available) 40074 1727204644.39798: exiting _queue_task() for managed-node2/fail 40074 1727204644.39813: done queuing things up, now waiting for results queue to drain 40074 1727204644.39814: waiting for pending results... 40074 1727204644.40006: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 40074 1727204644.40111: in run() - task 12b410aa-8751-9fd7-2501-000000000643 40074 1727204644.40127: variable 'ansible_search_path' from source: unknown 40074 1727204644.40130: variable 'ansible_search_path' from source: unknown 40074 1727204644.40165: calling self._execute() 40074 1727204644.40255: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.40259: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.40272: variable 'omit' from source: magic vars 40074 1727204644.40603: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.40609: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204644.40715: variable 'network_state' from source: role '' defaults 40074 1727204644.40728: Evaluated conditional (network_state != {}): False 40074 1727204644.40732: when evaluation is 
False, skipping this task 40074 1727204644.40735: _execute() done 40074 1727204644.40740: dumping result to json 40074 1727204644.40745: done dumping result, returning 40074 1727204644.40753: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-9fd7-2501-000000000643] 40074 1727204644.40758: sending task result for task 12b410aa-8751-9fd7-2501-000000000643 40074 1727204644.40852: done sending task result for task 12b410aa-8751-9fd7-2501-000000000643 40074 1727204644.40855: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204644.40913: no more pending results, returning what we have 40074 1727204644.40917: results queue empty 40074 1727204644.40918: checking for any_errors_fatal 40074 1727204644.40924: done checking for any_errors_fatal 40074 1727204644.40925: checking for max_fail_percentage 40074 1727204644.40927: done checking for max_fail_percentage 40074 1727204644.40928: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.40930: done checking to see if all hosts have failed 40074 1727204644.40930: getting the remaining hosts for this loop 40074 1727204644.40932: done getting the remaining hosts for this loop 40074 1727204644.40937: getting the next task for host managed-node2 40074 1727204644.40944: done getting next task for host managed-node2 40074 1727204644.40947: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 40074 1727204644.40952: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204644.40973: getting variables 40074 1727204644.40975: in VariableManager get_vars() 40074 1727204644.41014: Calling all_inventory to load vars for managed-node2 40074 1727204644.41017: Calling groups_inventory to load vars for managed-node2 40074 1727204644.41020: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.41030: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.41033: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.41037: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.42221: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.43923: done with get_vars() 40074 1727204644.43944: done getting variables 40074 1727204644.43996: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.044) 0:00:38.202 ***** 40074 1727204644.44023: entering _queue_task() for managed-node2/fail 40074 1727204644.44249: worker is 1 (out of 1 available) 40074 1727204644.44265: exiting _queue_task() for managed-node2/fail 40074 1727204644.44278: done queuing things up, now waiting for results queue to drain 40074 1727204644.44280: waiting for pending results... 40074 1727204644.44469: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 40074 1727204644.44569: in run() - task 12b410aa-8751-9fd7-2501-000000000644 40074 1727204644.44581: variable 'ansible_search_path' from source: unknown 40074 1727204644.44585: variable 'ansible_search_path' from source: unknown 40074 1727204644.44627: calling self._execute() 40074 1727204644.44713: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.44722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.44735: variable 'omit' from source: magic vars 40074 1727204644.45053: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.45064: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204644.45172: variable 'network_state' from source: role '' defaults 40074 1727204644.45178: Evaluated conditional (network_state != {}): False 40074 1727204644.45181: when evaluation is False, skipping this task 40074 1727204644.45187: _execute() done 40074 1727204644.45192: dumping result to json 40074 1727204644.45197: done dumping result, returning 40074 1727204644.45205: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed 
host is below 8 [12b410aa-8751-9fd7-2501-000000000644] 40074 1727204644.45211: sending task result for task 12b410aa-8751-9fd7-2501-000000000644 40074 1727204644.45315: done sending task result for task 12b410aa-8751-9fd7-2501-000000000644 40074 1727204644.45318: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204644.45368: no more pending results, returning what we have 40074 1727204644.45372: results queue empty 40074 1727204644.45373: checking for any_errors_fatal 40074 1727204644.45378: done checking for any_errors_fatal 40074 1727204644.45379: checking for max_fail_percentage 40074 1727204644.45382: done checking for max_fail_percentage 40074 1727204644.45383: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.45384: done checking to see if all hosts have failed 40074 1727204644.45385: getting the remaining hosts for this loop 40074 1727204644.45387: done getting the remaining hosts for this loop 40074 1727204644.45391: getting the next task for host managed-node2 40074 1727204644.45399: done getting next task for host managed-node2 40074 1727204644.45403: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 40074 1727204644.45407: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204644.45426: getting variables 40074 1727204644.45429: in VariableManager get_vars() 40074 1727204644.45467: Calling all_inventory to load vars for managed-node2 40074 1727204644.45470: Calling groups_inventory to load vars for managed-node2 40074 1727204644.45472: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.45486: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.45490: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.45493: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.46675: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.48273: done with get_vars() 40074 1727204644.48296: done getting variables 40074 1727204644.48345: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.043) 0:00:38.245 ***** 40074 1727204644.48371: entering _queue_task() for managed-node2/fail 40074 1727204644.48598: worker 
is 1 (out of 1 available) 40074 1727204644.48614: exiting _queue_task() for managed-node2/fail 40074 1727204644.48627: done queuing things up, now waiting for results queue to drain 40074 1727204644.48629: waiting for pending results... 40074 1727204644.48824: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 40074 1727204644.48938: in run() - task 12b410aa-8751-9fd7-2501-000000000645 40074 1727204644.48951: variable 'ansible_search_path' from source: unknown 40074 1727204644.48955: variable 'ansible_search_path' from source: unknown 40074 1727204644.48993: calling self._execute() 40074 1727204644.49079: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.49084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.49095: variable 'omit' from source: magic vars 40074 1727204644.49413: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.49426: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204644.49578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204644.52897: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204644.52902: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204644.52905: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204644.52940: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204644.52980: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204644.53087: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.53137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.53184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.53255: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.53282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.53422: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.53446: Evaluated conditional (ansible_distribution_major_version | int > 9): True 40074 1727204644.53681: variable 'ansible_distribution' from source: facts 40074 1727204644.53694: variable '__network_rh_distros' from source: role '' defaults 40074 1727204644.53700: Evaluated conditional (ansible_distribution in __network_rh_distros): False 40074 1727204644.53703: when evaluation is False, skipping this task 40074 1727204644.53705: _execute() done 40074 1727204644.53708: dumping result to json 40074 1727204644.53711: done dumping result, returning 40074 1727204644.53716: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-9fd7-2501-000000000645] 40074 
1727204644.53729: sending task result for task 12b410aa-8751-9fd7-2501-000000000645 40074 1727204644.53869: done sending task result for task 12b410aa-8751-9fd7-2501-000000000645 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 40074 1727204644.54039: no more pending results, returning what we have 40074 1727204644.54044: results queue empty 40074 1727204644.54045: checking for any_errors_fatal 40074 1727204644.54055: done checking for any_errors_fatal 40074 1727204644.54056: checking for max_fail_percentage 40074 1727204644.54059: done checking for max_fail_percentage 40074 1727204644.54060: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.54061: done checking to see if all hosts have failed 40074 1727204644.54062: getting the remaining hosts for this loop 40074 1727204644.54064: done getting the remaining hosts for this loop 40074 1727204644.54069: getting the next task for host managed-node2 40074 1727204644.54078: done getting next task for host managed-node2 40074 1727204644.54083: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 40074 1727204644.54088: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204644.54111: WORKER PROCESS EXITING 40074 1727204644.54236: getting variables 40074 1727204644.54239: in VariableManager get_vars() 40074 1727204644.54299: Calling all_inventory to load vars for managed-node2 40074 1727204644.54304: Calling groups_inventory to load vars for managed-node2 40074 1727204644.54307: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.54320: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.54394: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.54406: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.56022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.57616: done with get_vars() 40074 1727204644.57640: done getting variables 40074 1727204644.57696: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.093) 0:00:38.339 ***** 40074 1727204644.57724: entering _queue_task() for managed-node2/dnf 40074 1727204644.57994: worker is 1 (out of 1 available) 40074 1727204644.58009: exiting _queue_task() for managed-node2/dnf 40074 1727204644.58022: done queuing things up, 
now waiting for results queue to drain 40074 1727204644.58024: waiting for pending results... 40074 1727204644.58237: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 40074 1727204644.58351: in run() - task 12b410aa-8751-9fd7-2501-000000000646 40074 1727204644.58367: variable 'ansible_search_path' from source: unknown 40074 1727204644.58371: variable 'ansible_search_path' from source: unknown 40074 1727204644.58403: calling self._execute() 40074 1727204644.58495: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.58502: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.58513: variable 'omit' from source: magic vars 40074 1727204644.58854: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.58864: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204644.59050: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204644.60885: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204644.60942: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204644.60975: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204644.61008: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204644.61034: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204644.61108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.61144: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.61167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.61206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.61217: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.61318: variable 'ansible_distribution' from source: facts 40074 1727204644.61325: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.61333: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 40074 1727204644.61431: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204644.61546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.61567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.61587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.61624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.61640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.61673: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.61694: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.61714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.61752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.61764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.61799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.61819: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.61842: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.61876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.61888: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.62019: variable 'network_connections' from source: include params 40074 1727204644.62033: variable 'interface0' from source: play vars 40074 1727204644.62096: variable 'interface0' from source: play vars 40074 1727204644.62107: variable 'interface1' from source: play vars 40074 1727204644.62159: variable 'interface1' from source: play vars 40074 1727204644.62224: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204644.62357: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204644.62389: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204644.62419: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204644.62446: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204644.62482: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204644.62509: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204644.62533: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.62554: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204644.62607: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204644.62825: variable 'network_connections' from source: include params 40074 1727204644.62836: variable 'interface0' from source: play vars 40074 1727204644.62883: variable 'interface0' from source: play vars 40074 1727204644.62892: variable 'interface1' from source: play vars 40074 1727204644.62995: variable 'interface1' from source: play vars 40074 1727204644.62999: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 40074 1727204644.63002: when evaluation is False, skipping this task 40074 1727204644.63004: _execute() done 40074 1727204644.63006: dumping result to json 40074 1727204644.63008: done dumping result, returning 40074 1727204644.63011: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000646] 40074 1727204644.63013: sending task result for task 12b410aa-8751-9fd7-2501-000000000646 40074 1727204644.63088: done sending task result for task 
12b410aa-8751-9fd7-2501-000000000646 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 40074 1727204644.63153: no more pending results, returning what we have 40074 1727204644.63157: results queue empty 40074 1727204644.63158: checking for any_errors_fatal 40074 1727204644.63168: done checking for any_errors_fatal 40074 1727204644.63169: checking for max_fail_percentage 40074 1727204644.63171: done checking for max_fail_percentage 40074 1727204644.63172: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.63173: done checking to see if all hosts have failed 40074 1727204644.63174: getting the remaining hosts for this loop 40074 1727204644.63175: done getting the remaining hosts for this loop 40074 1727204644.63180: getting the next task for host managed-node2 40074 1727204644.63188: done getting next task for host managed-node2 40074 1727204644.63194: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 40074 1727204644.63198: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204644.63223: getting variables 40074 1727204644.63225: in VariableManager get_vars() 40074 1727204644.63270: Calling all_inventory to load vars for managed-node2 40074 1727204644.63273: Calling groups_inventory to load vars for managed-node2 40074 1727204644.63275: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.63286: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.63298: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.63304: WORKER PROCESS EXITING 40074 1727204644.63309: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.64563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.66260: done with get_vars() 40074 1727204644.66283: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 40074 1727204644.66350: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.086) 0:00:38.425 ***** 40074 1727204644.66376: entering _queue_task() for managed-node2/yum 40074 1727204644.66638: worker is 1 (out of 1 available) 40074 1727204644.66653: exiting _queue_task() for managed-node2/yum 40074 1727204644.66668: done 
queuing things up, now waiting for results queue to drain 40074 1727204644.66670: waiting for pending results... 40074 1727204644.66872: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 40074 1727204644.66976: in run() - task 12b410aa-8751-9fd7-2501-000000000647 40074 1727204644.66993: variable 'ansible_search_path' from source: unknown 40074 1727204644.66997: variable 'ansible_search_path' from source: unknown 40074 1727204644.67033: calling self._execute() 40074 1727204644.67120: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.67128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.67138: variable 'omit' from source: magic vars 40074 1727204644.67468: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.67478: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204644.67635: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204644.69447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204644.69498: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204644.69536: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204644.69566: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204644.69590: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204644.69664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.69698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.69723: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.69761: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.69773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.69857: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.69871: Evaluated conditional (ansible_distribution_major_version | int < 8): False 40074 1727204644.69874: when evaluation is False, skipping this task 40074 1727204644.69880: _execute() done 40074 1727204644.69886: dumping result to json 40074 1727204644.69892: done dumping result, returning 40074 1727204644.69900: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000647] 40074 1727204644.69905: sending task result for task 12b410aa-8751-9fd7-2501-000000000647 40074 1727204644.70007: done sending task result for task 12b410aa-8751-9fd7-2501-000000000647 40074 1727204644.70010: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": 
"ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 40074 1727204644.70065: no more pending results, returning what we have 40074 1727204644.70069: results queue empty 40074 1727204644.70070: checking for any_errors_fatal 40074 1727204644.70078: done checking for any_errors_fatal 40074 1727204644.70079: checking for max_fail_percentage 40074 1727204644.70080: done checking for max_fail_percentage 40074 1727204644.70081: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.70082: done checking to see if all hosts have failed 40074 1727204644.70083: getting the remaining hosts for this loop 40074 1727204644.70084: done getting the remaining hosts for this loop 40074 1727204644.70095: getting the next task for host managed-node2 40074 1727204644.70104: done getting next task for host managed-node2 40074 1727204644.70109: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 40074 1727204644.70113: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204644.70138: getting variables 40074 1727204644.70140: in VariableManager get_vars() 40074 1727204644.70183: Calling all_inventory to load vars for managed-node2 40074 1727204644.70187: Calling groups_inventory to load vars for managed-node2 40074 1727204644.70194: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.70209: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.70213: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.70216: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.71462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.73076: done with get_vars() 40074 1727204644.73107: done getting variables 40074 1727204644.73166: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.068) 0:00:38.493 ***** 40074 1727204644.73198: entering _queue_task() for managed-node2/fail 40074 1727204644.73483: worker is 1 (out of 1 available) 40074 1727204644.73499: exiting _queue_task() for managed-node2/fail 40074 1727204644.73514: done queuing things up, now waiting for results queue to drain 40074 1727204644.73516: waiting for pending results... 
40074 1727204644.73720: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 40074 1727204644.73828: in run() - task 12b410aa-8751-9fd7-2501-000000000648 40074 1727204644.73841: variable 'ansible_search_path' from source: unknown 40074 1727204644.73844: variable 'ansible_search_path' from source: unknown 40074 1727204644.73881: calling self._execute() 40074 1727204644.73975: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.73983: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.73995: variable 'omit' from source: magic vars 40074 1727204644.74333: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.74344: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204644.74451: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204644.74633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204644.76763: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204644.76819: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204644.76853: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204644.76881: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204644.76908: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204644.76982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 40074 1727204644.77007: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.77036: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.77068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.77081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.77127: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.77149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.77170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.77202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.77215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.77256: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.77275: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.77297: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.77330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.77343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.77491: variable 'network_connections' from source: include params 40074 1727204644.77502: variable 'interface0' from source: play vars 40074 1727204644.77574: variable 'interface0' from source: play vars 40074 1727204644.77584: variable 'interface1' from source: play vars 40074 1727204644.77640: variable 'interface1' from source: play vars 40074 1727204644.77709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204644.77873: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204644.77909: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204644.77939: 
Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204644.77964: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204644.78006: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204644.78029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204644.78050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.78071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204644.78120: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204644.78328: variable 'network_connections' from source: include params 40074 1727204644.78333: variable 'interface0' from source: play vars 40074 1727204644.78388: variable 'interface0' from source: play vars 40074 1727204644.78397: variable 'interface1' from source: play vars 40074 1727204644.78456: variable 'interface1' from source: play vars 40074 1727204644.78474: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 40074 1727204644.78478: when evaluation is False, skipping this task 40074 1727204644.78481: _execute() done 40074 1727204644.78486: dumping result to json 40074 1727204644.78492: done dumping result, returning 40074 1727204644.78500: done running TaskExecutor() for managed-node2/TASK: 
fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-000000000648] 40074 1727204644.78511: sending task result for task 12b410aa-8751-9fd7-2501-000000000648 40074 1727204644.78609: done sending task result for task 12b410aa-8751-9fd7-2501-000000000648 40074 1727204644.78612: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 40074 1727204644.78674: no more pending results, returning what we have 40074 1727204644.78678: results queue empty 40074 1727204644.78679: checking for any_errors_fatal 40074 1727204644.78686: done checking for any_errors_fatal 40074 1727204644.78686: checking for max_fail_percentage 40074 1727204644.78688: done checking for max_fail_percentage 40074 1727204644.78691: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.78693: done checking to see if all hosts have failed 40074 1727204644.78693: getting the remaining hosts for this loop 40074 1727204644.78695: done getting the remaining hosts for this loop 40074 1727204644.78700: getting the next task for host managed-node2 40074 1727204644.78708: done getting next task for host managed-node2 40074 1727204644.78712: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 40074 1727204644.78716: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204644.78747: getting variables 40074 1727204644.78750: in VariableManager get_vars() 40074 1727204644.78798: Calling all_inventory to load vars for managed-node2 40074 1727204644.78802: Calling groups_inventory to load vars for managed-node2 40074 1727204644.78804: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.78815: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.78818: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.78822: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.80250: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.81846: done with get_vars() 40074 1727204644.81875: done getting variables 40074 1727204644.81932: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.087) 0:00:38.581 ***** 40074 1727204644.81962: entering _queue_task() for managed-node2/package 40074 1727204644.82243: worker is 1 (out of 1 available) 40074 1727204644.82257: 
exiting _queue_task() for managed-node2/package 40074 1727204644.82271: done queuing things up, now waiting for results queue to drain 40074 1727204644.82273: waiting for pending results... 40074 1727204644.82478: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 40074 1727204644.82588: in run() - task 12b410aa-8751-9fd7-2501-000000000649 40074 1727204644.82604: variable 'ansible_search_path' from source: unknown 40074 1727204644.82608: variable 'ansible_search_path' from source: unknown 40074 1727204644.82643: calling self._execute() 40074 1727204644.82732: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204644.82739: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204644.82749: variable 'omit' from source: magic vars 40074 1727204644.83088: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.83100: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204644.83279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204644.83504: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204644.83543: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204644.83574: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204644.83635: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204644.83737: variable 'network_packages' from source: role '' defaults 40074 1727204644.83830: variable '__network_provider_setup' from source: role '' defaults 40074 1727204644.83839: variable '__network_service_name_default_nm' from source: role '' defaults 40074 1727204644.83900: variable 
'__network_service_name_default_nm' from source: role '' defaults 40074 1727204644.83909: variable '__network_packages_default_nm' from source: role '' defaults 40074 1727204644.83964: variable '__network_packages_default_nm' from source: role '' defaults 40074 1727204644.84124: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204644.85720: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204644.85772: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204644.85807: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204644.85836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204644.85859: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204644.85933: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.85957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.85980: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.86021: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.86033: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.86071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.86092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.86120: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.86151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.86164: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.86357: variable '__network_packages_default_gobject_packages' from source: role '' defaults 40074 1727204644.86455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.86475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.86498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.86530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.86546: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.86625: variable 'ansible_python' from source: facts 40074 1727204644.86648: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 40074 1727204644.86722: variable '__network_wpa_supplicant_required' from source: role '' defaults 40074 1727204644.86791: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 40074 1727204644.86899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.86921: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.86940: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.86970: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.86984: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.87035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204644.87071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204644.87103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.87135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204644.87149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204644.87270: variable 'network_connections' from source: include params 40074 1727204644.87276: variable 'interface0' from source: play vars 40074 1727204644.87362: variable 'interface0' from source: play vars 40074 1727204644.87373: variable 'interface1' from source: play vars 40074 1727204644.87458: variable 'interface1' from source: play vars 40074 1727204644.87518: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204644.87547: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 
(found_in_cache=True, class_only=False) 40074 1727204644.87573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204644.87600: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204644.87646: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204644.87878: variable 'network_connections' from source: include params 40074 1727204644.87886: variable 'interface0' from source: play vars 40074 1727204644.87969: variable 'interface0' from source: play vars 40074 1727204644.87980: variable 'interface1' from source: play vars 40074 1727204644.88060: variable 'interface1' from source: play vars 40074 1727204644.88091: variable '__network_packages_default_wireless' from source: role '' defaults 40074 1727204644.88163: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204644.88422: variable 'network_connections' from source: include params 40074 1727204644.88429: variable 'interface0' from source: play vars 40074 1727204644.88482: variable 'interface0' from source: play vars 40074 1727204644.88491: variable 'interface1' from source: play vars 40074 1727204644.88549: variable 'interface1' from source: play vars 40074 1727204644.88570: variable '__network_packages_default_team' from source: role '' defaults 40074 1727204644.88641: variable '__network_team_connections_defined' from source: role '' defaults 40074 1727204644.88890: variable 'network_connections' from source: include params 40074 1727204644.88894: variable 'interface0' from source: play vars 40074 1727204644.88952: variable 'interface0' from source: play vars 40074 1727204644.88962: variable 'interface1' from source: play vars 40074 
1727204644.89014: variable 'interface1' from source: play vars 40074 1727204644.89064: variable '__network_service_name_default_initscripts' from source: role '' defaults 40074 1727204644.89116: variable '__network_service_name_default_initscripts' from source: role '' defaults 40074 1727204644.89125: variable '__network_packages_default_initscripts' from source: role '' defaults 40074 1727204644.89179: variable '__network_packages_default_initscripts' from source: role '' defaults 40074 1727204644.89368: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 40074 1727204644.89766: variable 'network_connections' from source: include params 40074 1727204644.89770: variable 'interface0' from source: play vars 40074 1727204644.89827: variable 'interface0' from source: play vars 40074 1727204644.89836: variable 'interface1' from source: play vars 40074 1727204644.89884: variable 'interface1' from source: play vars 40074 1727204644.89893: variable 'ansible_distribution' from source: facts 40074 1727204644.89899: variable '__network_rh_distros' from source: role '' defaults 40074 1727204644.89906: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.89924: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 40074 1727204644.90065: variable 'ansible_distribution' from source: facts 40074 1727204644.90069: variable '__network_rh_distros' from source: role '' defaults 40074 1727204644.90075: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.90083: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 40074 1727204644.90226: variable 'ansible_distribution' from source: facts 40074 1727204644.90230: variable '__network_rh_distros' from source: role '' defaults 40074 1727204644.90236: variable 'ansible_distribution_major_version' from source: facts 40074 1727204644.90269: variable 'network_provider' from 
source: set_fact 40074 1727204644.90284: variable 'ansible_facts' from source: unknown 40074 1727204644.91118: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 40074 1727204644.91122: when evaluation is False, skipping this task 40074 1727204644.91125: _execute() done 40074 1727204644.91138: dumping result to json 40074 1727204644.91141: done dumping result, returning 40074 1727204644.91144: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-9fd7-2501-000000000649] 40074 1727204644.91151: sending task result for task 12b410aa-8751-9fd7-2501-000000000649 40074 1727204644.91260: done sending task result for task 12b410aa-8751-9fd7-2501-000000000649 40074 1727204644.91262: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 40074 1727204644.91320: no more pending results, returning what we have 40074 1727204644.91323: results queue empty 40074 1727204644.91324: checking for any_errors_fatal 40074 1727204644.91331: done checking for any_errors_fatal 40074 1727204644.91332: checking for max_fail_percentage 40074 1727204644.91334: done checking for max_fail_percentage 40074 1727204644.91335: checking to see if all hosts have failed and the running result is not ok 40074 1727204644.91336: done checking to see if all hosts have failed 40074 1727204644.91337: getting the remaining hosts for this loop 40074 1727204644.91338: done getting the remaining hosts for this loop 40074 1727204644.91343: getting the next task for host managed-node2 40074 1727204644.91353: done getting next task for host managed-node2 40074 1727204644.91357: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 40074 1727204644.91361: ^ state is: HOST STATE: block=3, task=11, 
rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204644.91391: getting variables 40074 1727204644.91394: in VariableManager get_vars() 40074 1727204644.91439: Calling all_inventory to load vars for managed-node2 40074 1727204644.91443: Calling groups_inventory to load vars for managed-node2 40074 1727204644.91445: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204644.91457: Calling all_plugins_play to load vars for managed-node2 40074 1727204644.91460: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204644.91463: Calling groups_plugins_play to load vars for managed-node2 40074 1727204644.92742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204644.94446: done with get_vars() 40074 1727204644.94469: done getting variables 40074 1727204644.94525: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK 
[fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.125) 0:00:38.707 *****
40074 1727204644.94555: entering _queue_task() for managed-node2/package
40074 1727204644.94836: worker is 1 (out of 1 available)
40074 1727204644.94852: exiting _queue_task() for managed-node2/package
40074 1727204644.94868: done queuing things up, now waiting for results queue to drain
40074 1727204644.94869: waiting for pending results...
40074 1727204644.95084: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
40074 1727204644.95192: in run() - task 12b410aa-8751-9fd7-2501-00000000064a
40074 1727204644.95210: variable 'ansible_search_path' from source: unknown
40074 1727204644.95215: variable 'ansible_search_path' from source: unknown
40074 1727204644.95246: calling self._execute()
40074 1727204644.95338: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204644.95345: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204644.95355: variable 'omit' from source: magic vars
40074 1727204644.95694: variable 'ansible_distribution_major_version' from source: facts
40074 1727204644.95705: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204644.95811: variable 'network_state' from source: role '' defaults
40074 1727204644.95866: Evaluated conditional (network_state != {}): False
40074 1727204644.95872: when evaluation is False, skipping this task
40074 1727204644.95875: _execute() done
40074 1727204644.95878: dumping result to json
40074 1727204644.95880: done dumping result, returning
40074 1727204644.95883: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-9fd7-2501-00000000064a]
40074 1727204644.95886: sending task result for task 12b410aa-8751-9fd7-2501-00000000064a
40074 1727204644.95961: done sending task result for task 12b410aa-8751-9fd7-2501-00000000064a
40074 1727204644.95964: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
40074 1727204644.96028: no more pending results, returning what we have
40074 1727204644.96031: results queue empty
40074 1727204644.96032: checking for any_errors_fatal
40074 1727204644.96040: done checking for any_errors_fatal
40074 1727204644.96041: checking for max_fail_percentage
40074 1727204644.96043: done checking for max_fail_percentage
40074 1727204644.96044: checking to see if all hosts have failed and the running result is not ok
40074 1727204644.96045: done checking to see if all hosts have failed
40074 1727204644.96045: getting the remaining hosts for this loop
40074 1727204644.96047: done getting the remaining hosts for this loop
40074 1727204644.96051: getting the next task for host managed-node2
40074 1727204644.96058: done getting next task for host managed-node2
40074 1727204644.96063: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
40074 1727204644.96067: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204644.96090: getting variables
40074 1727204644.96092: in VariableManager get_vars()
40074 1727204644.96135: Calling all_inventory to load vars for managed-node2
40074 1727204644.96139: Calling groups_inventory to load vars for managed-node2
40074 1727204644.96141: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204644.96151: Calling all_plugins_play to load vars for managed-node2
40074 1727204644.96154: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204644.96158: Calling groups_plugins_play to load vars for managed-node2
40074 1727204644.97368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204644.98970: done with get_vars()
40074 1727204644.98994: done getting variables
40074 1727204644.99046: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Tuesday 24 September 2024 15:04:04 -0400 (0:00:00.045) 0:00:38.752 *****
40074 1727204644.99073: entering _queue_task() for managed-node2/package
40074 1727204644.99324: worker is 1 (out of 1 available)
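The two package tasks traced above are both skipped for the same reason: the role only installs the nmstate tooling when the caller passes a non-empty `network_state`, and this run leaves it at its empty default, so the log shows `Evaluated conditional (network_state != {}): False`. A minimal sketch of what such a guarded task looks like (illustrative only, not the role's exact source; the package list is an assumption):

```yaml
# Illustrative sketch, not the actual roles/network/tasks/main.yml content.
# With the default network_state of {}, the when: clause is False and the
# executor reports "when evaluation is False, skipping this task".
- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager  # assumed package names, for illustration
      - nmstate
    state: present
  when: network_state != {}
```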
40074 1727204644.99339: exiting _queue_task() for managed-node2/package
40074 1727204644.99351: done queuing things up, now waiting for results queue to drain
40074 1727204644.99353: waiting for pending results...
40074 1727204644.99539: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
40074 1727204644.99643: in run() - task 12b410aa-8751-9fd7-2501-00000000064b
40074 1727204644.99655: variable 'ansible_search_path' from source: unknown
40074 1727204644.99658: variable 'ansible_search_path' from source: unknown
40074 1727204644.99694: calling self._execute()
40074 1727204644.99787: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204644.99799: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204644.99807: variable 'omit' from source: magic vars
40074 1727204645.00135: variable 'ansible_distribution_major_version' from source: facts
40074 1727204645.00146: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204645.00252: variable 'network_state' from source: role '' defaults
40074 1727204645.00265: Evaluated conditional (network_state != {}): False
40074 1727204645.00268: when evaluation is False, skipping this task
40074 1727204645.00271: _execute() done
40074 1727204645.00277: dumping result to json
40074 1727204645.00280: done dumping result, returning
40074 1727204645.00290: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-9fd7-2501-00000000064b]
40074 1727204645.00295: sending task result for task 12b410aa-8751-9fd7-2501-00000000064b
40074 1727204645.00396: done sending task result for task 12b410aa-8751-9fd7-2501-00000000064b
40074 1727204645.00399: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
40074 1727204645.00453: no more pending results, returning what we have
40074 1727204645.00458: results queue empty
40074 1727204645.00459: checking for any_errors_fatal
40074 1727204645.00465: done checking for any_errors_fatal
40074 1727204645.00466: checking for max_fail_percentage
40074 1727204645.00467: done checking for max_fail_percentage
40074 1727204645.00468: checking to see if all hosts have failed and the running result is not ok
40074 1727204645.00469: done checking to see if all hosts have failed
40074 1727204645.00470: getting the remaining hosts for this loop
40074 1727204645.00472: done getting the remaining hosts for this loop
40074 1727204645.00476: getting the next task for host managed-node2
40074 1727204645.00484: done getting next task for host managed-node2
40074 1727204645.00487: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
40074 1727204645.00493: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204645.00512: getting variables
40074 1727204645.00514: in VariableManager get_vars()
40074 1727204645.00554: Calling all_inventory to load vars for managed-node2
40074 1727204645.00557: Calling groups_inventory to load vars for managed-node2
40074 1727204645.00559: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204645.00569: Calling all_plugins_play to load vars for managed-node2
40074 1727204645.00572: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204645.00576: Calling groups_plugins_play to load vars for managed-node2
40074 1727204645.01904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204645.03507: done with get_vars()
40074 1727204645.03538: done getting variables
40074 1727204645.03594: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.045) 0:00:38.798 *****
40074 1727204645.03626: entering _queue_task() for managed-node2/service
40074 1727204645.03905: worker is 1 (out of 1 available)
40074 1727204645.03922: exiting _queue_task() for managed-node2/service
40074 1727204645.03935: done queuing things up, now waiting for results queue to drain
40074 1727204645.03937: waiting for pending results...
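The restart task traced next is guarded by two role-internal booleans derived from `network_connections`; with only the ordinary `interface0`/`interface1` profiles in play, neither wireless nor team connections are defined, so the compound conditional comes out False and the task is skipped. A hedged sketch of that pattern (illustrative, not the role's source):

```yaml
# Illustrative sketch of the conditional restart, not the role's source.
# NetworkManager needs a restart before activating wireless or team
# profiles, so the role restarts it only when either interface type
# appears in network_connections.
- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
```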
40074 1727204645.04140: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
40074 1727204645.04253: in run() - task 12b410aa-8751-9fd7-2501-00000000064c
40074 1727204645.04266: variable 'ansible_search_path' from source: unknown
40074 1727204645.04271: variable 'ansible_search_path' from source: unknown
40074 1727204645.04307: calling self._execute()
40074 1727204645.04400: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204645.04407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204645.04420: variable 'omit' from source: magic vars
40074 1727204645.04743: variable 'ansible_distribution_major_version' from source: facts
40074 1727204645.04755: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204645.04860: variable '__network_wireless_connections_defined' from source: role '' defaults
40074 1727204645.05039: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204645.06823: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204645.06874: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204645.06910: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204645.06945: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204645.06968: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204645.07044: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204645.07077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204645.07101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.07139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204645.07152: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204645.07193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204645.07213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204645.07237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.07272: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204645.07284: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204645.07323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204645.07345: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204645.07366: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.07399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204645.07411: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204645.07563: variable 'network_connections' from source: include params
40074 1727204645.07573: variable 'interface0' from source: play vars
40074 1727204645.07639: variable 'interface0' from source: play vars
40074 1727204645.07651: variable 'interface1' from source: play vars
40074 1727204645.07708: variable 'interface1' from source: play vars
40074 1727204645.07767: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204645.07904: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204645.07937: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204645.07965: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204645.07992: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204645.08032: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204645.08050: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204645.08071: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.08093: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204645.08143: variable '__network_team_connections_defined' from source: role '' defaults
40074 1727204645.08346: variable 'network_connections' from source: include params
40074 1727204645.08352: variable 'interface0' from source: play vars
40074 1727204645.08405: variable 'interface0' from source: play vars
40074 1727204645.08413: variable 'interface1' from source: play vars
40074 1727204645.08467: variable 'interface1' from source: play vars
40074 1727204645.08490: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
40074 1727204645.08494: when evaluation is False, skipping this task
40074 1727204645.08497: _execute() done
40074 1727204645.08502: dumping result to json
40074 1727204645.08506: done dumping result, returning
40074 1727204645.08515: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-9fd7-2501-00000000064c]
40074 1727204645.08528: sending task result for task 12b410aa-8751-9fd7-2501-00000000064c
40074 1727204645.08623: done sending task result for task 12b410aa-8751-9fd7-2501-00000000064c
40074 1727204645.08626: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
40074 1727204645.08698: no more pending results, returning what we have
40074 1727204645.08702: results queue empty
40074 1727204645.08704: checking for any_errors_fatal
40074 1727204645.08711: done checking for any_errors_fatal
40074 1727204645.08712: checking for max_fail_percentage
40074 1727204645.08713: done checking for max_fail_percentage
40074 1727204645.08714: checking to see if all hosts have failed and the running result is not ok
40074 1727204645.08715: done checking to see if all hosts have failed
40074 1727204645.08716: getting the remaining hosts for this loop
40074 1727204645.08720: done getting the remaining hosts for this loop
40074 1727204645.08724: getting the next task for host managed-node2
40074 1727204645.08732: done getting next task for host managed-node2
40074 1727204645.08738: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
40074 1727204645.08743: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204645.08765: getting variables
40074 1727204645.08767: in VariableManager get_vars()
40074 1727204645.08813: Calling all_inventory to load vars for managed-node2
40074 1727204645.08816: Calling groups_inventory to load vars for managed-node2
40074 1727204645.08821: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204645.08831: Calling all_plugins_play to load vars for managed-node2
40074 1727204645.08834: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204645.08838: Calling groups_plugins_play to load vars for managed-node2
40074 1727204645.10105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204645.15682: done with get_vars()
40074 1727204645.15709: done getting variables
40074 1727204645.15758: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.121) 0:00:38.919 *****
40074 1727204645.15780: entering _queue_task() for managed-node2/service
40074 1727204645.16065: worker is 1 (out of 1 available)
40074 1727204645.16079: exiting _queue_task() for managed-node2/service
40074 1727204645.16094: done queuing things up, now waiting for results queue to drain
40074 1727204645.16096: waiting for pending results...
40074 1727204645.16298: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
40074 1727204645.16416: in run() - task 12b410aa-8751-9fd7-2501-00000000064d
40074 1727204645.16433: variable 'ansible_search_path' from source: unknown
40074 1727204645.16437: variable 'ansible_search_path' from source: unknown
40074 1727204645.16470: calling self._execute()
40074 1727204645.16565: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204645.16573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204645.16583: variable 'omit' from source: magic vars
40074 1727204645.16917: variable 'ansible_distribution_major_version' from source: facts
40074 1727204645.16932: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204645.17072: variable 'network_provider' from source: set_fact
40074 1727204645.17079: variable 'network_state' from source: role '' defaults
40074 1727204645.17093: Evaluated conditional (network_provider == "nm" or network_state != {}): True
40074 1727204645.17098: variable 'omit' from source: magic vars
40074 1727204645.17153: variable 'omit' from source: magic vars
40074 1727204645.17177: variable 'network_service_name' from source: role '' defaults
40074 1727204645.17247: variable 'network_service_name' from source: role '' defaults
40074 1727204645.17340: variable '__network_provider_setup' from source: role '' defaults
40074 1727204645.17347: variable '__network_service_name_default_nm' from source: role '' defaults
40074 1727204645.17401: variable '__network_service_name_default_nm' from source: role '' defaults
40074 1727204645.17408: variable '__network_packages_default_nm' from source: role '' defaults
40074 1727204645.17465: variable '__network_packages_default_nm' from source: role '' defaults
40074 1727204645.17664: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
40074 1727204645.19393: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
40074 1727204645.19452: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
40074 1727204645.19484: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
40074 1727204645.19519: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
40074 1727204645.19544: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
40074 1727204645.19614: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204645.19643: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204645.19664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.19698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204645.19714: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204645.19757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204645.19777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204645.19799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.19836: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204645.19849: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204645.20042: variable '__network_packages_default_gobject_packages' from source: role '' defaults
40074 1727204645.20140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204645.20161: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204645.20185: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.20218: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204645.20232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204645.20308: variable 'ansible_python' from source: facts
40074 1727204645.20330: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults
40074 1727204645.20399: variable '__network_wpa_supplicant_required' from source: role '' defaults
40074 1727204645.20464: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults
40074 1727204645.20571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204645.20593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204645.20618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.20650: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204645.20663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204645.20705: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
40074 1727204645.20734: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
40074 1727204645.20755: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.20785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
40074 1727204645.20801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
40074 1727204645.20916: variable 'network_connections' from source: include params
40074 1727204645.20926: variable 'interface0' from source: play vars
40074 1727204645.20990: variable 'interface0' from source: play vars
40074 1727204645.21003: variable 'interface1' from source: play vars
40074 1727204645.21067: variable 'interface1' from source: play vars
40074 1727204645.21155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
40074 1727204645.21312: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
40074 1727204645.21356: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
40074 1727204645.21396: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
40074 1727204645.21432: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
40074 1727204645.21480: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
40074 1727204645.21510: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
40074 1727204645.21539: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
40074 1727204645.21568: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
40074 1727204645.21614: variable '__network_wireless_connections_defined' from source: role '' defaults
40074 1727204645.21842: variable 'network_connections' from source: include params
40074 1727204645.21849: variable 'interface0' from source: play vars
40074 1727204645.21910: variable 'interface0' from source: play vars
40074 1727204645.21928: variable 'interface1' from source: play vars
40074 1727204645.21984: variable 'interface1' from source: play vars
40074 1727204645.22015: variable '__network_packages_default_wireless' from source: role '' defaults
40074 1727204645.22083: variable '__network_wireless_connections_defined' from source: role '' defaults
40074 1727204645.22327: variable 'network_connections' from source: include params
40074 1727204645.22333: variable 'interface0' from source: play vars
40074 1727204645.22394: variable 'interface0' from source: play vars
40074 1727204645.22403: variable 'interface1' from source: play vars
40074 1727204645.22464: variable 'interface1' from source: play vars
40074 1727204645.22485: variable '__network_packages_default_team' from source: role '' defaults
40074 1727204645.22553: variable '__network_team_connections_defined' from source: role '' defaults
40074 1727204645.22793: variable 'network_connections' from source: include params
40074 1727204645.22797: variable 'interface0' from source: play vars
40074 1727204645.22858: variable 'interface0' from source: play vars
40074 1727204645.22866: variable 'interface1' from source: play vars
40074 1727204645.22930: variable 'interface1' from source: play vars
40074 1727204645.22975: variable '__network_service_name_default_initscripts' from source: role '' defaults
40074 1727204645.23028: variable '__network_service_name_default_initscripts' from source: role '' defaults
40074 1727204645.23035: variable '__network_packages_default_initscripts' from source: role '' defaults
40074 1727204645.23084: variable '__network_packages_default_initscripts' from source: role '' defaults
40074 1727204645.23280: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
40074 1727204645.23698: variable 'network_connections' from source: include params
40074 1727204645.23702: variable 'interface0' from source: play vars
40074 1727204645.23754: variable 'interface0' from source: play vars
40074 1727204645.23761: variable 'interface1' from source: play vars
40074 1727204645.23815: variable 'interface1' from source: play vars
40074 1727204645.23824: variable 'ansible_distribution' from source: facts
40074 1727204645.23828: variable '__network_rh_distros' from source: role '' defaults
40074 1727204645.23835: variable 'ansible_distribution_major_version' from source: facts
40074 1727204645.23849: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
40074 1727204645.23998: variable 'ansible_distribution' from source: facts
40074 1727204645.24003: variable '__network_rh_distros' from source: role '' defaults
40074 1727204645.24013: variable 'ansible_distribution_major_version' from source: facts
40074 1727204645.24016: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
40074 1727204645.24158: variable 'ansible_distribution' from source: facts
40074 1727204645.24162: variable '__network_rh_distros' from source: role '' defaults
40074 1727204645.24168: variable 'ansible_distribution_major_version' from source: facts
40074 1727204645.24201: variable 'network_provider' from source: set_fact
40074 1727204645.24224: variable 'omit' from source: magic vars
40074 1727204645.24250: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204645.24274: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204645.24291: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204645.24309: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204645.24322: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204645.24351: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204645.24354: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204645.24359: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204645.24447: Set connection var ansible_pipelining to False
40074 1727204645.24450: Set connection var ansible_shell_executable to /bin/sh
40074 1727204645.24460: Set connection var ansible_shell_type to sh
40074 1727204645.24462: Set connection var ansible_connection to ssh
40074 1727204645.24465: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204645.24472: Set connection var ansible_timeout to 10
40074 1727204645.24498: variable 'ansible_shell_executable' from source: unknown
40074 1727204645.24501: variable 'ansible_connection' from source: unknown
40074 1727204645.24504: variable 'ansible_module_compression' from source: unknown
40074 1727204645.24510: variable 'ansible_shell_type' from source: unknown
40074 1727204645.24512: variable 'ansible_shell_executable' from source: unknown
40074 1727204645.24522: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204645.24524: variable 'ansible_pipelining' from source: unknown
40074 1727204645.24527: variable 'ansible_timeout' from source: unknown
40074 1727204645.24529: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204645.24615: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204645.24627: variable 'omit' from source: magic vars
40074 1727204645.24634: starting attempt loop
40074 1727204645.24637: running the handler
40074 1727204645.24705: variable 'ansible_facts' from source: unknown
40074 1727204645.25439: _low_level_execute_command(): starting
40074 1727204645.25443: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
40074 1727204645.25986: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
40074 1727204645.25993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
40074 1727204645.25996: stderr chunk (state=3): >>>debug1:
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204645.25998: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.26053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204645.26056: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204645.26116: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204645.27877: stdout chunk (state=3): >>>/root <<< 40074 1727204645.27993: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204645.28046: stderr chunk (state=3): >>><<< 40074 1727204645.28050: stdout chunk (state=3): >>><<< 40074 1727204645.28070: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204645.28082: _low_level_execute_command(): starting 40074 1727204645.28088: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913 `" && echo ansible-tmp-1727204645.280691-41854-240941732324913="` echo /root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913 `" ) && sleep 0' 40074 1727204645.28547: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204645.28550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.28553: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204645.28555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.28607: stderr 
chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204645.28616: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204645.28655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204645.30683: stdout chunk (state=3): >>>ansible-tmp-1727204645.280691-41854-240941732324913=/root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913 <<< 40074 1727204645.30802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204645.30855: stderr chunk (state=3): >>><<< 40074 1727204645.30858: stdout chunk (state=3): >>><<< 40074 1727204645.30875: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204645.280691-41854-240941732324913=/root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 
debug2: Received exit status from master 0 40074 1727204645.30906: variable 'ansible_module_compression' from source: unknown 40074 1727204645.30954: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 40074 1727204645.31012: variable 'ansible_facts' from source: unknown 40074 1727204645.31159: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/AnsiballZ_systemd.py 40074 1727204645.31285: Sending initial data 40074 1727204645.31292: Sent initial data (155 bytes) 40074 1727204645.31778: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204645.31782: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.31785: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204645.31790: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204645.31792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.31843: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204645.31846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 <<< 40074 1727204645.31893: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204645.33560: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 40074 1727204645.33567: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204645.33599: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204645.33641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpvrmyp_gm /root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/AnsiballZ_systemd.py <<< 40074 1727204645.33647: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/AnsiballZ_systemd.py" <<< 40074 1727204645.33678: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpvrmyp_gm" to remote "/root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/AnsiballZ_systemd.py" <<< 40074 1727204645.35362: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204645.35430: stderr chunk (state=3): >>><<< 40074 1727204645.35434: stdout chunk (state=3): >>><<< 40074 1727204645.35456: done transferring module to remote 40074 1727204645.35469: _low_level_execute_command(): starting 40074 1727204645.35473: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/ /root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/AnsiballZ_systemd.py && sleep 0' 40074 1727204645.35962: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204645.35966: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204645.35969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 
1727204645.35971: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204645.35973: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.36042: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204645.36049: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204645.36052: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204645.36078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204645.37996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204645.38046: stderr chunk (state=3): >>><<< 40074 1727204645.38049: stdout chunk (state=3): >>><<< 40074 1727204645.38062: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204645.38065: _low_level_execute_command(): starting 40074 1727204645.38073: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/AnsiballZ_systemd.py && sleep 0' 40074 1727204645.38536: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204645.38539: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204645.38542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204645.38544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.38600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204645.38604: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204645.38652: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204645.71809: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 
; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4579328", "MemoryAvailable": "infinity", "CPUUsageNSec": "2443619000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", 
"MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", 
"PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": 
"shutdown.target multi-user.target network.target network.service cloud-init.service NetworkManager-wait-online.service", "After": "systemd-journald.socket sysinit.target dbus.socket cloud-init-local.service system.slice network-pre.target basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:49 EDT", "StateChangeTimestampMonotonic": "1013574884", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", 
"StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 40074 1727204645.73783: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204645.73851: stderr chunk (state=3): >>><<< 40074 1727204645.73856: stdout chunk (state=3): >>><<< 40074 1727204645.73874: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ 
path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4579328", "MemoryAvailable": "infinity", "CPUUsageNSec": "2443619000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", 
"StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", 
"SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "sysinit.target dbus.socket system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "shutdown.target multi-user.target network.target network.service cloud-init.service NetworkManager-wait-online.service", "After": "systemd-journald.socket sysinit.target dbus.socket cloud-init-local.service system.slice network-pre.target basic.target dbus-broker.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 15:01:49 EDT", "StateChangeTimestampMonotonic": "1013574884", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", 
"CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204645.74052: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204645.74070: _low_level_execute_command(): starting 40074 1727204645.74076: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204645.280691-41854-240941732324913/ > /dev/null 2>&1 && sleep 0' 40074 1727204645.74560: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204645.74564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.74566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204645.74570: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204645.74627: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204645.74633: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204645.74673: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204645.76630: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204645.76685: stderr chunk (state=3): >>><<< 40074 1727204645.76691: stdout chunk (state=3): >>><<< 40074 1727204645.76705: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204645.76714: handler run complete 40074 1727204645.76773: attempt loop complete, returning result 40074 1727204645.76776: _execute() done 40074 1727204645.76779: dumping result to json 40074 1727204645.76796: done dumping result, returning 40074 1727204645.76805: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-9fd7-2501-00000000064d] 40074 1727204645.76811: sending task result for task 12b410aa-8751-9fd7-2501-00000000064d 40074 1727204645.77094: done sending task result for task 12b410aa-8751-9fd7-2501-00000000064d 40074 1727204645.77097: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204645.77163: no more pending results, returning what we have 40074 1727204645.77166: results queue empty 40074 1727204645.77167: checking for any_errors_fatal 40074 1727204645.77174: done checking for any_errors_fatal 40074 1727204645.77174: checking for max_fail_percentage 40074 1727204645.77176: done checking for max_fail_percentage 40074 1727204645.77177: checking to see if all hosts have failed and the running result is not ok 40074 1727204645.77178: done checking to see if all hosts have failed 40074 1727204645.77179: getting the remaining hosts for this loop 40074 1727204645.77180: done getting the remaining hosts for this loop 40074 1727204645.77185: getting the next task for host managed-node2 40074 1727204645.77198: done getting next task for host managed-node2 40074 1727204645.77202: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start 
wpa_supplicant 40074 1727204645.77211: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204645.77227: getting variables 40074 1727204645.77229: in VariableManager get_vars() 40074 1727204645.77267: Calling all_inventory to load vars for managed-node2 40074 1727204645.77270: Calling groups_inventory to load vars for managed-node2 40074 1727204645.77272: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204645.77282: Calling all_plugins_play to load vars for managed-node2 40074 1727204645.77284: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204645.77287: Calling groups_plugins_play to load vars for managed-node2 40074 1727204645.78560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204645.80187: done with get_vars() 40074 1727204645.80215: done getting variables 40074 1727204645.80269: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.645) 0:00:39.564 ***** 40074 1727204645.80299: entering _queue_task() for managed-node2/service 40074 1727204645.80573: worker is 1 (out of 1 available) 40074 1727204645.80590: exiting _queue_task() for managed-node2/service 40074 1727204645.80604: done queuing things up, now waiting for results queue to drain 40074 1727204645.80606: waiting for pending results... 40074 1727204645.80829: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 40074 1727204645.80947: in run() - task 12b410aa-8751-9fd7-2501-00000000064e 40074 1727204645.80960: variable 'ansible_search_path' from source: unknown 40074 1727204645.80963: variable 'ansible_search_path' from source: unknown 40074 1727204645.80999: calling self._execute() 40074 1727204645.81094: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204645.81102: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204645.81112: variable 'omit' from source: magic vars 40074 1727204645.81449: variable 'ansible_distribution_major_version' from source: facts 40074 1727204645.81460: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204645.81566: variable 'network_provider' from source: set_fact 40074 1727204645.81571: Evaluated conditional (network_provider == "nm"): True 40074 1727204645.81654: variable '__network_wpa_supplicant_required' from source: role '' defaults 40074 1727204645.81731: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 40074 1727204645.81883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204645.83893: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204645.83952: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204645.83986: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204645.84017: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204645.84046: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204645.84115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204645.84147: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204645.84168: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204645.84202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204645.84214: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204645.84261: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204645.84280: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204645.84304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204645.84342: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204645.84355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204645.84393: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204645.84413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204645.84436: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204645.84471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 
1727204645.84483: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204645.84602: variable 'network_connections' from source: include params 40074 1727204645.84613: variable 'interface0' from source: play vars 40074 1727204645.84677: variable 'interface0' from source: play vars 40074 1727204645.84687: variable 'interface1' from source: play vars 40074 1727204645.84742: variable 'interface1' from source: play vars 40074 1727204645.84808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 40074 1727204645.84954: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 40074 1727204645.84985: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 40074 1727204645.85017: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 40074 1727204645.85043: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 40074 1727204645.85078: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 40074 1727204645.85099: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 40074 1727204645.85127: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204645.85149: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 40074 1727204645.85194: variable '__network_wireless_connections_defined' from source: role '' defaults 40074 1727204645.85401: variable 'network_connections' from source: include params 40074 1727204645.85407: variable 'interface0' from source: play vars 40074 1727204645.85462: variable 'interface0' from source: play vars 40074 1727204645.85470: variable 'interface1' from source: play vars 40074 1727204645.85523: variable 'interface1' from source: play vars 40074 1727204645.85554: Evaluated conditional (__network_wpa_supplicant_required): False 40074 1727204645.85559: when evaluation is False, skipping this task 40074 1727204645.85570: _execute() done 40074 1727204645.85573: dumping result to json 40074 1727204645.85576: done dumping result, returning 40074 1727204645.85579: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-9fd7-2501-00000000064e] 40074 1727204645.85581: sending task result for task 12b410aa-8751-9fd7-2501-00000000064e 40074 1727204645.85678: done sending task result for task 12b410aa-8751-9fd7-2501-00000000064e 40074 1727204645.85681: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 40074 1727204645.85741: no more pending results, returning what we have 40074 1727204645.85744: results queue empty 40074 1727204645.85745: checking for any_errors_fatal 40074 1727204645.85765: done checking for any_errors_fatal 40074 1727204645.85766: checking for max_fail_percentage 40074 1727204645.85768: done checking for max_fail_percentage 40074 1727204645.85769: checking to see if all hosts have failed and the running result is not ok 40074 1727204645.85770: done checking to see if all hosts have failed 40074 1727204645.85771: getting the 
remaining hosts for this loop 40074 1727204645.85772: done getting the remaining hosts for this loop 40074 1727204645.85777: getting the next task for host managed-node2 40074 1727204645.85785: done getting next task for host managed-node2 40074 1727204645.85797: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 40074 1727204645.85801: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204645.85824: getting variables 40074 1727204645.85826: in VariableManager get_vars() 40074 1727204645.85870: Calling all_inventory to load vars for managed-node2 40074 1727204645.85873: Calling groups_inventory to load vars for managed-node2 40074 1727204645.85876: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204645.85886: Calling all_plugins_play to load vars for managed-node2 40074 1727204645.85893: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204645.85897: Calling groups_plugins_play to load vars for managed-node2 40074 1727204645.87274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204645.88873: done with get_vars() 40074 1727204645.88897: done getting variables 40074 1727204645.88948: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 15:04:05 -0400 (0:00:00.086) 0:00:39.651 ***** 40074 1727204645.88974: entering _queue_task() for managed-node2/service 40074 1727204645.89235: worker is 1 (out of 1 available) 40074 1727204645.89251: exiting _queue_task() for managed-node2/service 40074 1727204645.89266: done queuing things up, now waiting for results queue to drain 40074 1727204645.89267: waiting for pending results... 
40074 1727204645.89474: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 40074 1727204645.89590: in run() - task 12b410aa-8751-9fd7-2501-00000000064f 40074 1727204645.89605: variable 'ansible_search_path' from source: unknown 40074 1727204645.89609: variable 'ansible_search_path' from source: unknown 40074 1727204645.89644: calling self._execute() 40074 1727204645.89741: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204645.89749: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204645.89760: variable 'omit' from source: magic vars 40074 1727204645.90088: variable 'ansible_distribution_major_version' from source: facts 40074 1727204645.90100: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204645.90198: variable 'network_provider' from source: set_fact 40074 1727204645.90204: Evaluated conditional (network_provider == "initscripts"): False 40074 1727204645.90207: when evaluation is False, skipping this task 40074 1727204645.90213: _execute() done 40074 1727204645.90220: dumping result to json 40074 1727204645.90223: done dumping result, returning 40074 1727204645.90230: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-9fd7-2501-00000000064f] 40074 1727204645.90235: sending task result for task 12b410aa-8751-9fd7-2501-00000000064f skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 40074 1727204645.90382: no more pending results, returning what we have 40074 1727204645.90386: results queue empty 40074 1727204645.90387: checking for any_errors_fatal 40074 1727204645.90399: done checking for any_errors_fatal 40074 1727204645.90400: checking for max_fail_percentage 40074 1727204645.90403: done checking for max_fail_percentage 40074 
1727204645.90404: checking to see if all hosts have failed and the running result is not ok 40074 1727204645.90405: done checking to see if all hosts have failed 40074 1727204645.90406: getting the remaining hosts for this loop 40074 1727204645.90408: done getting the remaining hosts for this loop 40074 1727204645.90412: getting the next task for host managed-node2 40074 1727204645.90423: done getting next task for host managed-node2 40074 1727204645.90427: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 40074 1727204645.90431: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
40074 1727204645.90451: getting variables
40074 1727204645.90453: in VariableManager get_vars()
40074 1727204645.90502: Calling all_inventory to load vars for managed-node2
40074 1727204645.90505: Calling groups_inventory to load vars for managed-node2
40074 1727204645.90508: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204645.90514: done sending task result for task 12b410aa-8751-9fd7-2501-00000000064f
40074 1727204645.90519: WORKER PROCESS EXITING
40074 1727204645.90528: Calling all_plugins_play to load vars for managed-node2
40074 1727204645.90532: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204645.90536: Calling groups_plugins_play to load vars for managed-node2
40074 1727204645.91843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204645.93460: done with get_vars()
40074 1727204645.93482: done getting variables
40074 1727204645.93534: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150
Tuesday 24 September 2024  15:04:05 -0400 (0:00:00.045)       0:00:39.697 *****
40074 1727204645.93563: entering _queue_task() for managed-node2/copy
40074 1727204645.93812: worker is 1 (out of 1 available)
40074 1727204645.93831: exiting _queue_task() for managed-node2/copy
40074 1727204645.93845: done queuing things up, now waiting for results queue to drain
40074 1727204645.93846: waiting for pending results...
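The two skipped tasks above follow the role's provider gate: each task carries a `when:` condition on `network_provider`, and since this run resolved the provider to NetworkManager, the `initscripts`-only tasks short-circuit before execution (visible as `Evaluated conditional (network_provider == "initscripts"): False`). A minimal sketch of what such a gated task looks like; this is a hypothetical shape for illustration, not the role's actual source, which lives at the `roles/network/tasks/main.yml` paths printed in the log:

```yaml
# Hypothetical sketch of a provider-gated task, mirroring the
# conditional evaluations in the log above. Not the role's real code.
- name: Enable network service
  ansible.builtin.service:
    name: network
    enabled: true
  when:
    - ansible_distribution_major_version != '6'
    - network_provider == "initscripts"
```

When any entry in the `when:` list evaluates false, the TaskExecutor reports `when evaluation is False, skipping this task` and returns a skipped result without contacting the host.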
40074 1727204645.94046: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 40074 1727204645.94155: in run() - task 12b410aa-8751-9fd7-2501-000000000650 40074 1727204645.94168: variable 'ansible_search_path' from source: unknown 40074 1727204645.94172: variable 'ansible_search_path' from source: unknown 40074 1727204645.94209: calling self._execute() 40074 1727204645.94304: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204645.94308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204645.94323: variable 'omit' from source: magic vars 40074 1727204645.94643: variable 'ansible_distribution_major_version' from source: facts 40074 1727204645.94653: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204645.94754: variable 'network_provider' from source: set_fact 40074 1727204645.94760: Evaluated conditional (network_provider == "initscripts"): False 40074 1727204645.94763: when evaluation is False, skipping this task 40074 1727204645.94768: _execute() done 40074 1727204645.94773: dumping result to json 40074 1727204645.94778: done dumping result, returning 40074 1727204645.94787: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-9fd7-2501-000000000650] 40074 1727204645.94794: sending task result for task 12b410aa-8751-9fd7-2501-000000000650 40074 1727204645.94893: done sending task result for task 12b410aa-8751-9fd7-2501-000000000650 40074 1727204645.94897: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 40074 1727204645.94949: no more pending results, returning what we have 40074 1727204645.94953: results queue empty 40074 1727204645.94954: checking for 
any_errors_fatal 40074 1727204645.94960: done checking for any_errors_fatal 40074 1727204645.94961: checking for max_fail_percentage 40074 1727204645.94963: done checking for max_fail_percentage 40074 1727204645.94964: checking to see if all hosts have failed and the running result is not ok 40074 1727204645.94965: done checking to see if all hosts have failed 40074 1727204645.94966: getting the remaining hosts for this loop 40074 1727204645.94967: done getting the remaining hosts for this loop 40074 1727204645.94972: getting the next task for host managed-node2 40074 1727204645.94979: done getting next task for host managed-node2 40074 1727204645.94983: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 40074 1727204645.94987: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
40074 1727204645.95008: getting variables
40074 1727204645.95010: in VariableManager get_vars()
40074 1727204645.95051: Calling all_inventory to load vars for managed-node2
40074 1727204645.95054: Calling groups_inventory to load vars for managed-node2
40074 1727204645.95056: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204645.95066: Calling all_plugins_play to load vars for managed-node2
40074 1727204645.95069: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204645.95073: Calling groups_plugins_play to load vars for managed-node2
40074 1727204645.96302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204645.97923: done with get_vars()
40074 1727204645.97946: done getting variables

TASK [fedora.linux_system_roles.network : Configure networking connection profiles] ***
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Tuesday 24 September 2024  15:04:05 -0400 (0:00:00.044)       0:00:39.741 *****
40074 1727204645.98021: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections
40074 1727204645.98278: worker is 1 (out of 1 available)
40074 1727204645.98297: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections
40074 1727204645.98311: done queuing things up, now waiting for results queue to drain
40074 1727204645.98312: waiting for pending results...
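The "Configure networking connection profiles" task queued here hands a `network_connections` list to the role's `network_connections` module. Reconstructed from the `module_args` echoed in the module's JSON result further down in this log, the effective play vars for this run are equivalent to:

```yaml
# Reconstructed from the module_args visible in the module result
# below: both test profiles are deactivated and removed.
network_connections:
  - name: ethtest0
    persistent_state: absent
    state: down
  - name: ethtest1
    persistent_state: absent
    state: down
```

`persistent_state: absent` removes the profile while `state: down` deactivates it first, which is why the run ends with NetworkManager "volatilize" callbacks for `ethtest0` and `ethtest1` as the profiles are torn down.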
40074 1727204645.98507: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 40074 1727204645.98613: in run() - task 12b410aa-8751-9fd7-2501-000000000651 40074 1727204645.98627: variable 'ansible_search_path' from source: unknown 40074 1727204645.98630: variable 'ansible_search_path' from source: unknown 40074 1727204645.98666: calling self._execute() 40074 1727204645.98757: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204645.98761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204645.98777: variable 'omit' from source: magic vars 40074 1727204645.99100: variable 'ansible_distribution_major_version' from source: facts 40074 1727204645.99111: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204645.99121: variable 'omit' from source: magic vars 40074 1727204645.99168: variable 'omit' from source: magic vars 40074 1727204645.99306: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 40074 1727204646.01283: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 40074 1727204646.01337: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 40074 1727204646.01368: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 40074 1727204646.01399: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 40074 1727204646.01428: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 40074 1727204646.01490: variable 'network_provider' from source: set_fact 40074 1727204646.01607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 40074 1727204646.01636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 40074 1727204646.01658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 40074 1727204646.01691: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 40074 1727204646.01704: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 40074 1727204646.01768: variable 'omit' from source: magic vars 40074 1727204646.01865: variable 'omit' from source: magic vars 40074 1727204646.01957: variable 'network_connections' from source: include params 40074 1727204646.01965: variable 'interface0' from source: play vars 40074 1727204646.02054: variable 'interface0' from source: play vars 40074 1727204646.02060: variable 'interface1' from source: play vars 40074 1727204646.02083: variable 'interface1' from source: play vars 40074 1727204646.02212: variable 'omit' from source: magic vars 40074 1727204646.02222: variable '__lsr_ansible_managed' from source: task vars 40074 1727204646.02273: variable '__lsr_ansible_managed' from source: task vars 40074 1727204646.02509: Loaded config def from plugin (lookup/template) 40074 1727204646.02515: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 40074 1727204646.02540: File lookup term: 
get_ansible_managed.j2 40074 1727204646.02544: variable 'ansible_search_path' from source: unknown 40074 1727204646.02550: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 40074 1727204646.02563: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 40074 1727204646.02577: variable 'ansible_search_path' from source: unknown 40074 1727204646.08184: variable 'ansible_managed' from source: unknown 40074 1727204646.08324: variable 'omit' from source: magic vars 40074 1727204646.08350: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204646.08377: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204646.08394: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204646.08410: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204646.08471: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204646.08475: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204646.08480: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.08483: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204646.08538: Set connection var ansible_pipelining to False 40074 1727204646.08543: Set connection var ansible_shell_executable to /bin/sh 40074 1727204646.08546: Set connection var ansible_shell_type to sh 40074 1727204646.08550: Set connection var ansible_connection to ssh 40074 1727204646.08557: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204646.08564: Set connection var ansible_timeout to 10 40074 1727204646.08588: variable 'ansible_shell_executable' from source: unknown 40074 1727204646.08594: variable 'ansible_connection' from source: unknown 40074 1727204646.08597: variable 'ansible_module_compression' from source: unknown 40074 1727204646.08603: variable 'ansible_shell_type' from source: unknown 40074 1727204646.08606: variable 'ansible_shell_executable' from source: unknown 40074 1727204646.08609: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.08613: variable 'ansible_pipelining' from source: unknown 40074 1727204646.08620: variable 'ansible_timeout' from source: unknown 40074 1727204646.08622: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204646.08735: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204646.08747: variable 'omit' from source: magic vars 40074 1727204646.08754: starting attempt loop 40074 1727204646.08756: running the handler 40074 1727204646.08769: _low_level_execute_command(): starting 40074 1727204646.08775: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204646.09319: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204646.09323: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.09325: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204646.09328: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.09386: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204646.09394: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204646.09440: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204646.11220: stdout chunk (state=3): >>>/root <<< 40074 
1727204646.11330: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204646.11380: stderr chunk (state=3): >>><<< 40074 1727204646.11384: stdout chunk (state=3): >>><<< 40074 1727204646.11407: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204646.11419: _low_level_execute_command(): starting 40074 1727204646.11427: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060 `" && echo ansible-tmp-1727204646.1140518-41872-260201763728060="` echo /root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060 `" ) && sleep 0' 40074 1727204646.11882: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204646.11885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.11888: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204646.11892: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.11942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204646.11950: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204646.11991: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204646.16899: stdout chunk (state=3): >>>ansible-tmp-1727204646.1140518-41872-260201763728060=/root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060 <<< 40074 1727204646.17024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204646.17087: stderr chunk (state=3): >>><<< 40074 1727204646.17093: stdout chunk (state=3): >>><<< 40074 1727204646.17111: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204646.1140518-41872-260201763728060=/root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204646.17161: variable 'ansible_module_compression' from source: unknown 40074 1727204646.17204: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 40074 1727204646.17249: variable 'ansible_facts' from source: unknown 40074 1727204646.17343: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/AnsiballZ_network_connections.py 40074 1727204646.17465: Sending initial data 40074 1727204646.17469: Sent initial data (168 bytes) 40074 1727204646.17953: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 
1727204646.17958: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204646.17964: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204646.17967: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.18021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204646.18032: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204646.18064: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204646.19747: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 40074 1727204646.19751: stderr chunk (state=3): >>>debug2: Server 
supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204646.19781: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204646.19822: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpunwuenwn /root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/AnsiballZ_network_connections.py <<< 40074 1727204646.19825: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/AnsiballZ_network_connections.py" <<< 40074 1727204646.19856: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpunwuenwn" to remote "/root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/AnsiballZ_network_connections.py" <<< 40074 1727204646.20957: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204646.21023: stderr chunk (state=3): >>><<< 40074 1727204646.21027: stdout chunk (state=3): >>><<< 40074 1727204646.21047: done transferring module to remote 40074 1727204646.21060: _low_level_execute_command(): starting 40074 1727204646.21063: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/ /root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/AnsiballZ_network_connections.py && sleep 0' 40074 1727204646.21539: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204646.21542: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.21547: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204646.21550: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204646.21553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.21603: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204646.21608: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204646.21647: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204646.23530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204646.23584: stderr chunk (state=3): >>><<< 40074 1727204646.23588: stdout chunk (state=3): >>><<< 40074 1727204646.23609: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204646.23612: _low_level_execute_command(): starting 40074 1727204646.23618: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/AnsiballZ_network_connections.py && sleep 0' 40074 1727204646.24081: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204646.24085: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204646.24087: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.24091: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204646.24094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.24152: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204646.24155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204646.24196: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204646.72866: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ktu7akgl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 40074 1727204646.72900: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ktu7akgl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/00af1773-a047-48e6-9537-86cd5f38b3ec: error=unknown <<< 40074 1727204646.74751: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 40074 1727204646.74765: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ktu7akgl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ktu7akgl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail<<< 40074 1727204646.74841: stdout chunk (state=3): >>> ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/5f0b5761-22ac-495b-91d9-5e5de304bc07: error=unknown <<< 40074 1727204646.75035: stdout chunk (state=3): >>> <<< 40074 1727204646.75049: stdout chunk (state=3): >>>{"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 40074 1727204646.77125: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204646.77129: stdout chunk (state=3): >>><<< 40074 1727204646.77131: stderr chunk (state=3): >>><<< 40074 1727204646.77296: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ktu7akgl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ktu7akgl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest0/00af1773-a047-48e6-9537-86cd5f38b3ec: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ktu7akgl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_ktu7akgl/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on ethtest1/5f0b5761-22ac-495b-91d9-5e5de304bc07: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": 
"absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "ethtest0", "persistent_state": "absent", "state": "down"}, {"name": "ethtest1", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204646.77299: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'ethtest0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'ethtest1', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204646.77302: _low_level_execute_command(): starting 40074 1727204646.77305: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204646.1140518-41872-260201763728060/ > /dev/null 2>&1 && sleep 0' 40074 1727204646.77945: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204646.77961: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204646.78006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.78030: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204646.78143: 
stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204646.78212: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204646.78238: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204646.78261: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204646.78339: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204646.80365: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204646.80378: stderr chunk (state=3): >>><<< 40074 1727204646.80381: stdout chunk (state=3): >>><<< 40074 1727204646.80401: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204646.80408: handler run complete 40074 1727204646.80436: attempt loop complete, returning result 40074 1727204646.80439: _execute() done 40074 1727204646.80442: dumping result to json 40074 1727204646.80449: done dumping result, returning 40074 1727204646.80462: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-9fd7-2501-000000000651] 40074 1727204646.80465: sending task result for task 12b410aa-8751-9fd7-2501-000000000651 40074 1727204646.80587: done sending task result for task 12b410aa-8751-9fd7-2501-000000000651 40074 1727204646.80591: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 40074 1727204646.80750: no more pending results, returning what we have 40074 1727204646.80755: results queue empty 40074 1727204646.80756: checking for any_errors_fatal 40074 1727204646.80762: done checking for any_errors_fatal 40074 1727204646.80763: checking for max_fail_percentage 40074 1727204646.80765: done checking for max_fail_percentage 40074 1727204646.80766: checking to see if all hosts have failed and the running result is not ok 40074 1727204646.80767: done checking to see if all hosts have failed 40074 1727204646.80768: getting the remaining hosts for this loop 40074 
1727204646.80769: done getting the remaining hosts for this loop 40074 1727204646.80773: getting the next task for host managed-node2 40074 1727204646.80781: done getting next task for host managed-node2 40074 1727204646.80785: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 40074 1727204646.80800: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204646.80814: getting variables 40074 1727204646.80816: in VariableManager get_vars() 40074 1727204646.80862: Calling all_inventory to load vars for managed-node2 40074 1727204646.80865: Calling groups_inventory to load vars for managed-node2 40074 1727204646.80867: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204646.80877: Calling all_plugins_play to load vars for managed-node2 40074 1727204646.80880: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204646.80883: Calling groups_plugins_play to load vars for managed-node2 40074 1727204646.83412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204646.85520: done with get_vars() 40074 1727204646.85550: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 15:04:06 -0400 (0:00:00.876) 0:00:40.618 ***** 40074 1727204646.85632: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 40074 1727204646.85920: worker is 1 (out of 1 available) 40074 1727204646.85937: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 40074 1727204646.85952: done queuing things up, now waiting for results queue to drain 40074 1727204646.85953: waiting for pending results... 
40074 1727204646.86154: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 40074 1727204646.86258: in run() - task 12b410aa-8751-9fd7-2501-000000000652 40074 1727204646.86271: variable 'ansible_search_path' from source: unknown 40074 1727204646.86274: variable 'ansible_search_path' from source: unknown 40074 1727204646.86340: calling self._execute() 40074 1727204646.86494: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.86499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204646.86502: variable 'omit' from source: magic vars 40074 1727204646.86910: variable 'ansible_distribution_major_version' from source: facts 40074 1727204646.86929: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204646.87084: variable 'network_state' from source: role '' defaults 40074 1727204646.87115: Evaluated conditional (network_state != {}): False 40074 1727204646.87123: when evaluation is False, skipping this task 40074 1727204646.87132: _execute() done 40074 1727204646.87141: dumping result to json 40074 1727204646.87150: done dumping result, returning 40074 1727204646.87162: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-9fd7-2501-000000000652] 40074 1727204646.87176: sending task result for task 12b410aa-8751-9fd7-2501-000000000652 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 40074 1727204646.87352: no more pending results, returning what we have 40074 1727204646.87357: results queue empty 40074 1727204646.87358: checking for any_errors_fatal 40074 1727204646.87374: done checking for any_errors_fatal 40074 1727204646.87375: checking for max_fail_percentage 40074 1727204646.87377: done checking for max_fail_percentage 40074 1727204646.87379: 
checking to see if all hosts have failed and the running result is not ok 40074 1727204646.87380: done checking to see if all hosts have failed 40074 1727204646.87381: getting the remaining hosts for this loop 40074 1727204646.87382: done getting the remaining hosts for this loop 40074 1727204646.87387: getting the next task for host managed-node2 40074 1727204646.87398: done getting next task for host managed-node2 40074 1727204646.87404: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 40074 1727204646.87410: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204646.87439: getting variables 40074 1727204646.87442: in VariableManager get_vars() 40074 1727204646.87520: Calling all_inventory to load vars for managed-node2 40074 1727204646.87524: Calling groups_inventory to load vars for managed-node2 40074 1727204646.87527: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204646.87541: Calling all_plugins_play to load vars for managed-node2 40074 1727204646.87545: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204646.87549: Calling groups_plugins_play to load vars for managed-node2 40074 1727204646.88106: done sending task result for task 12b410aa-8751-9fd7-2501-000000000652 40074 1727204646.88110: WORKER PROCESS EXITING 40074 1727204646.89066: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204646.90693: done with get_vars() 40074 1727204646.90715: done getting variables 40074 1727204646.90771: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 15:04:06 -0400 (0:00:00.051) 0:00:40.669 ***** 40074 1727204646.90801: entering _queue_task() for managed-node2/debug 40074 1727204646.91071: worker is 1 (out of 1 available) 40074 1727204646.91086: exiting _queue_task() for managed-node2/debug 40074 1727204646.91102: done queuing things up, now waiting for results queue to drain 40074 1727204646.91104: waiting for pending results... 
40074 1727204646.91300: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 40074 1727204646.91403: in run() - task 12b410aa-8751-9fd7-2501-000000000653 40074 1727204646.91420: variable 'ansible_search_path' from source: unknown 40074 1727204646.91424: variable 'ansible_search_path' from source: unknown 40074 1727204646.91458: calling self._execute() 40074 1727204646.91548: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.91553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204646.91567: variable 'omit' from source: magic vars 40074 1727204646.91895: variable 'ansible_distribution_major_version' from source: facts 40074 1727204646.91904: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204646.91910: variable 'omit' from source: magic vars 40074 1727204646.91963: variable 'omit' from source: magic vars 40074 1727204646.92001: variable 'omit' from source: magic vars 40074 1727204646.92038: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204646.92068: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204646.92088: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204646.92106: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204646.92125: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204646.92148: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204646.92152: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.92156: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed-node2' 40074 1727204646.92246: Set connection var ansible_pipelining to False 40074 1727204646.92253: Set connection var ansible_shell_executable to /bin/sh 40074 1727204646.92256: Set connection var ansible_shell_type to sh 40074 1727204646.92259: Set connection var ansible_connection to ssh 40074 1727204646.92267: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204646.92273: Set connection var ansible_timeout to 10 40074 1727204646.92296: variable 'ansible_shell_executable' from source: unknown 40074 1727204646.92299: variable 'ansible_connection' from source: unknown 40074 1727204646.92302: variable 'ansible_module_compression' from source: unknown 40074 1727204646.92306: variable 'ansible_shell_type' from source: unknown 40074 1727204646.92309: variable 'ansible_shell_executable' from source: unknown 40074 1727204646.92314: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.92321: variable 'ansible_pipelining' from source: unknown 40074 1727204646.92324: variable 'ansible_timeout' from source: unknown 40074 1727204646.92326: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204646.92449: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204646.92460: variable 'omit' from source: magic vars 40074 1727204646.92467: starting attempt loop 40074 1727204646.92470: running the handler 40074 1727204646.92580: variable '__network_connections_result' from source: set_fact 40074 1727204646.92630: handler run complete 40074 1727204646.92645: attempt loop complete, returning result 40074 1727204646.92648: _execute() done 40074 1727204646.92651: dumping result to json 40074 1727204646.92658: 
done dumping result, returning 40074 1727204646.92667: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-9fd7-2501-000000000653] 40074 1727204646.92671: sending task result for task 12b410aa-8751-9fd7-2501-000000000653 40074 1727204646.92766: done sending task result for task 12b410aa-8751-9fd7-2501-000000000653 40074 1727204646.92772: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 40074 1727204646.92850: no more pending results, returning what we have 40074 1727204646.92854: results queue empty 40074 1727204646.92855: checking for any_errors_fatal 40074 1727204646.92861: done checking for any_errors_fatal 40074 1727204646.92862: checking for max_fail_percentage 40074 1727204646.92864: done checking for max_fail_percentage 40074 1727204646.92865: checking to see if all hosts have failed and the running result is not ok 40074 1727204646.92866: done checking to see if all hosts have failed 40074 1727204646.92867: getting the remaining hosts for this loop 40074 1727204646.92868: done getting the remaining hosts for this loop 40074 1727204646.92872: getting the next task for host managed-node2 40074 1727204646.92879: done getting next task for host managed-node2 40074 1727204646.92891: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 40074 1727204646.92897: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204646.92910: getting variables 40074 1727204646.92912: in VariableManager get_vars() 40074 1727204646.92953: Calling all_inventory to load vars for managed-node2 40074 1727204646.92956: Calling groups_inventory to load vars for managed-node2 40074 1727204646.92958: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204646.92968: Calling all_plugins_play to load vars for managed-node2 40074 1727204646.92971: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204646.92974: Calling groups_plugins_play to load vars for managed-node2 40074 1727204646.94208: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204646.95823: done with get_vars() 40074 1727204646.95846: done getting variables 40074 1727204646.95894: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 15:04:06 -0400 (0:00:00.051) 0:00:40.721 ***** 40074 1727204646.95924: entering _queue_task() for managed-node2/debug 40074 1727204646.96177: worker is 1 (out of 1 available) 40074 
1727204646.96193: exiting _queue_task() for managed-node2/debug 40074 1727204646.96207: done queuing things up, now waiting for results queue to drain 40074 1727204646.96209: waiting for pending results... 40074 1727204646.96404: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 40074 1727204646.96503: in run() - task 12b410aa-8751-9fd7-2501-000000000654 40074 1727204646.96521: variable 'ansible_search_path' from source: unknown 40074 1727204646.96525: variable 'ansible_search_path' from source: unknown 40074 1727204646.96559: calling self._execute() 40074 1727204646.96646: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.96652: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204646.96667: variable 'omit' from source: magic vars 40074 1727204646.96990: variable 'ansible_distribution_major_version' from source: facts 40074 1727204646.96999: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204646.97007: variable 'omit' from source: magic vars 40074 1727204646.97055: variable 'omit' from source: magic vars 40074 1727204646.97083: variable 'omit' from source: magic vars 40074 1727204646.97123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204646.97155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204646.97173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204646.97191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204646.97209: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204646.97234: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204646.97237: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.97242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204646.97330: Set connection var ansible_pipelining to False 40074 1727204646.97338: Set connection var ansible_shell_executable to /bin/sh 40074 1727204646.97341: Set connection var ansible_shell_type to sh 40074 1727204646.97343: Set connection var ansible_connection to ssh 40074 1727204646.97350: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204646.97357: Set connection var ansible_timeout to 10 40074 1727204646.97379: variable 'ansible_shell_executable' from source: unknown 40074 1727204646.97382: variable 'ansible_connection' from source: unknown 40074 1727204646.97386: variable 'ansible_module_compression' from source: unknown 40074 1727204646.97388: variable 'ansible_shell_type' from source: unknown 40074 1727204646.97394: variable 'ansible_shell_executable' from source: unknown 40074 1727204646.97398: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204646.97403: variable 'ansible_pipelining' from source: unknown 40074 1727204646.97406: variable 'ansible_timeout' from source: unknown 40074 1727204646.97411: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204646.97531: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204646.97543: variable 'omit' from source: magic vars 40074 1727204646.97550: starting attempt loop 40074 1727204646.97553: running the handler 40074 1727204646.97597: variable '__network_connections_result' from source: set_fact 40074 
1727204646.97663: variable '__network_connections_result' from source: set_fact 40074 1727204646.97766: handler run complete 40074 1727204646.97790: attempt loop complete, returning result 40074 1727204646.97794: _execute() done 40074 1727204646.97798: dumping result to json 40074 1727204646.97803: done dumping result, returning 40074 1727204646.97812: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-9fd7-2501-000000000654] 40074 1727204646.97819: sending task result for task 12b410aa-8751-9fd7-2501-000000000654 40074 1727204646.97919: done sending task result for task 12b410aa-8751-9fd7-2501-000000000654 40074 1727204646.97922: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "ethtest0", "persistent_state": "absent", "state": "down" }, { "name": "ethtest1", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 40074 1727204646.98026: no more pending results, returning what we have 40074 1727204646.98030: results queue empty 40074 1727204646.98031: checking for any_errors_fatal 40074 1727204646.98037: done checking for any_errors_fatal 40074 1727204646.98038: checking for max_fail_percentage 40074 1727204646.98040: done checking for max_fail_percentage 40074 1727204646.98041: checking to see if all hosts have failed and the running result is not ok 40074 1727204646.98042: done checking to see if all hosts have failed 40074 1727204646.98043: getting the remaining hosts for this loop 40074 1727204646.98044: done getting the remaining hosts for this loop 40074 1727204646.98048: getting the next task for host managed-node2 40074 1727204646.98055: 
done getting next task for host managed-node2 40074 1727204646.98059: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 40074 1727204646.98062: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204646.98074: getting variables 40074 1727204646.98076: in VariableManager get_vars() 40074 1727204646.98126: Calling all_inventory to load vars for managed-node2 40074 1727204646.98129: Calling groups_inventory to load vars for managed-node2 40074 1727204646.98132: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204646.98143: Calling all_plugins_play to load vars for managed-node2 40074 1727204646.98145: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204646.98148: Calling groups_plugins_play to load vars for managed-node2 40074 1727204646.99512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204647.01125: done with get_vars() 40074 1727204647.01152: done getting variables 40074 1727204647.01201: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 15:04:07 -0400 (0:00:00.053) 0:00:40.774 ***** 40074 1727204647.01230: entering _queue_task() for managed-node2/debug 40074 1727204647.01483: worker is 1 (out of 1 available) 40074 1727204647.01503: exiting _queue_task() for managed-node2/debug 40074 1727204647.01515: done queuing things up, now waiting for results queue to drain 40074 1727204647.01520: waiting for pending results... 
40074 1727204647.01706: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 40074 1727204647.01820: in run() - task 12b410aa-8751-9fd7-2501-000000000655 40074 1727204647.01834: variable 'ansible_search_path' from source: unknown 40074 1727204647.01838: variable 'ansible_search_path' from source: unknown 40074 1727204647.01873: calling self._execute() 40074 1727204647.01961: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204647.01972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204647.01980: variable 'omit' from source: magic vars 40074 1727204647.02303: variable 'ansible_distribution_major_version' from source: facts 40074 1727204647.02314: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204647.02422: variable 'network_state' from source: role '' defaults 40074 1727204647.02429: Evaluated conditional (network_state != {}): False 40074 1727204647.02433: when evaluation is False, skipping this task 40074 1727204647.02436: _execute() done 40074 1727204647.02442: dumping result to json 40074 1727204647.02446: done dumping result, returning 40074 1727204647.02455: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-9fd7-2501-000000000655] 40074 1727204647.02460: sending task result for task 12b410aa-8751-9fd7-2501-000000000655 40074 1727204647.02554: done sending task result for task 12b410aa-8751-9fd7-2501-000000000655 40074 1727204647.02556: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 40074 1727204647.02608: no more pending results, returning what we have 40074 1727204647.02612: results queue empty 40074 1727204647.02613: checking for any_errors_fatal 40074 1727204647.02627: done checking for any_errors_fatal 40074 1727204647.02628: checking for 
max_fail_percentage 40074 1727204647.02629: done checking for max_fail_percentage 40074 1727204647.02630: checking to see if all hosts have failed and the running result is not ok 40074 1727204647.02631: done checking to see if all hosts have failed 40074 1727204647.02632: getting the remaining hosts for this loop 40074 1727204647.02634: done getting the remaining hosts for this loop 40074 1727204647.02638: getting the next task for host managed-node2 40074 1727204647.02645: done getting next task for host managed-node2 40074 1727204647.02649: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 40074 1727204647.02653: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204647.02673: getting variables 40074 1727204647.02674: in VariableManager get_vars() 40074 1727204647.02714: Calling all_inventory to load vars for managed-node2 40074 1727204647.02720: Calling groups_inventory to load vars for managed-node2 40074 1727204647.02722: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204647.02732: Calling all_plugins_play to load vars for managed-node2 40074 1727204647.02735: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204647.02738: Calling groups_plugins_play to load vars for managed-node2 40074 1727204647.04088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204647.05696: done with get_vars() 40074 1727204647.05723: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 15:04:07 -0400 (0:00:00.045) 0:00:40.819 ***** 40074 1727204647.05802: entering _queue_task() for managed-node2/ping 40074 1727204647.06057: worker is 1 (out of 1 available) 40074 1727204647.06073: exiting _queue_task() for managed-node2/ping 40074 1727204647.06087: done queuing things up, now waiting for results queue to drain 40074 1727204647.06088: waiting for pending results... 
40074 1727204647.06277: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 40074 1727204647.06384: in run() - task 12b410aa-8751-9fd7-2501-000000000656 40074 1727204647.06398: variable 'ansible_search_path' from source: unknown 40074 1727204647.06402: variable 'ansible_search_path' from source: unknown 40074 1727204647.06438: calling self._execute() 40074 1727204647.06526: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204647.06531: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204647.06546: variable 'omit' from source: magic vars 40074 1727204647.06869: variable 'ansible_distribution_major_version' from source: facts 40074 1727204647.06875: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204647.06885: variable 'omit' from source: magic vars 40074 1727204647.06935: variable 'omit' from source: magic vars 40074 1727204647.06962: variable 'omit' from source: magic vars 40074 1727204647.07002: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204647.07036: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204647.07053: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204647.07070: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204647.07087: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204647.07114: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204647.07120: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204647.07123: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 40074 1727204647.07211: Set connection var ansible_pipelining to False 40074 1727204647.07220: Set connection var ansible_shell_executable to /bin/sh 40074 1727204647.07224: Set connection var ansible_shell_type to sh 40074 1727204647.07226: Set connection var ansible_connection to ssh 40074 1727204647.07232: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204647.07238: Set connection var ansible_timeout to 10 40074 1727204647.07261: variable 'ansible_shell_executable' from source: unknown 40074 1727204647.07264: variable 'ansible_connection' from source: unknown 40074 1727204647.07267: variable 'ansible_module_compression' from source: unknown 40074 1727204647.07270: variable 'ansible_shell_type' from source: unknown 40074 1727204647.07274: variable 'ansible_shell_executable' from source: unknown 40074 1727204647.07278: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204647.07283: variable 'ansible_pipelining' from source: unknown 40074 1727204647.07287: variable 'ansible_timeout' from source: unknown 40074 1727204647.07294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204647.07468: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204647.07479: variable 'omit' from source: magic vars 40074 1727204647.07485: starting attempt loop 40074 1727204647.07488: running the handler 40074 1727204647.07504: _low_level_execute_command(): starting 40074 1727204647.07511: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204647.08070: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 
1727204647.08074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 40074 1727204647.08077: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204647.08080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.08135: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204647.08139: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.08193: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.10002: stdout chunk (state=3): >>>/root <<< 40074 1727204647.10116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.10175: stderr chunk (state=3): >>><<< 40074 1727204647.10178: stdout chunk (state=3): >>><<< 40074 1727204647.10203: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204647.10215: _low_level_execute_command(): starting 40074 1727204647.10224: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790 `" && echo ansible-tmp-1727204647.1020315-41899-100814571057790="` echo /root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790 `" ) && sleep 0' 40074 1727204647.10712: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.10715: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.10718: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 
1727204647.10727: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.10783: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204647.10790: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.10827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.12945: stdout chunk (state=3): >>>ansible-tmp-1727204647.1020315-41899-100814571057790=/root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790 <<< 40074 1727204647.13065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.13120: stderr chunk (state=3): >>><<< 40074 1727204647.13125: stdout chunk (state=3): >>><<< 40074 1727204647.13146: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204647.1020315-41899-100814571057790=/root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204647.13191: variable 'ansible_module_compression' from source: unknown 40074 1727204647.13232: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 40074 1727204647.13267: variable 'ansible_facts' from source: unknown 40074 1727204647.13328: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/AnsiballZ_ping.py 40074 1727204647.13446: Sending initial data 40074 1727204647.13450: Sent initial data (153 bytes) 40074 1727204647.13924: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.13927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204647.13930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.13932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.13986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204647.13992: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.14037: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.15747: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 40074 1727204647.15756: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204647.15785: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204647.15820: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpj7g887yy /root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/AnsiballZ_ping.py <<< 40074 1727204647.15828: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/AnsiballZ_ping.py" <<< 40074 1727204647.15856: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpj7g887yy" to remote "/root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/AnsiballZ_ping.py" <<< 40074 1727204647.16613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.16680: stderr chunk (state=3): >>><<< 40074 1727204647.16683: stdout chunk (state=3): >>><<< 40074 1727204647.16704: done transferring module to remote 40074 1727204647.16715: _low_level_execute_command(): starting 40074 1727204647.16721: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/ /root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/AnsiballZ_ping.py && sleep 0' 40074 1727204647.17187: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.17191: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.17194: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.17204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.17253: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204647.17262: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.17301: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.19270: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.19320: stderr chunk (state=3): >>><<< 40074 1727204647.19326: stdout chunk (state=3): >>><<< 40074 1727204647.19343: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204647.19348: _low_level_execute_command(): starting 40074 1727204647.19358: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/AnsiballZ_ping.py && sleep 0' 40074 1727204647.19827: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.19831: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204647.19833: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.19836: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.19881: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204647.19885: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.19942: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 40074 1727204647.37401: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 40074 1727204647.38804: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204647.38863: stderr chunk (state=3): >>><<< 40074 1727204647.38866: stdout chunk (state=3): >>><<< 40074 1727204647.38882: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204647.38916: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204647.38926: _low_level_execute_command(): starting 40074 1727204647.38931: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204647.1020315-41899-100814571057790/ > /dev/null 2>&1 && sleep 0' 40074 1727204647.39393: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.39410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204647.39414: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.39430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.39496: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204647.39501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204647.39503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.39541: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.41492: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.41542: stderr chunk (state=3): >>><<< 40074 1727204647.41545: stdout chunk (state=3): >>><<< 40074 1727204647.41560: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204647.41569: handler run complete 40074 
1727204647.41586: attempt loop complete, returning result 40074 1727204647.41590: _execute() done 40074 1727204647.41595: dumping result to json 40074 1727204647.41600: done dumping result, returning 40074 1727204647.41613: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-9fd7-2501-000000000656] 40074 1727204647.41618: sending task result for task 12b410aa-8751-9fd7-2501-000000000656 40074 1727204647.41715: done sending task result for task 12b410aa-8751-9fd7-2501-000000000656 40074 1727204647.41718: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 40074 1727204647.41794: no more pending results, returning what we have 40074 1727204647.41797: results queue empty 40074 1727204647.41798: checking for any_errors_fatal 40074 1727204647.41805: done checking for any_errors_fatal 40074 1727204647.41806: checking for max_fail_percentage 40074 1727204647.41808: done checking for max_fail_percentage 40074 1727204647.41808: checking to see if all hosts have failed and the running result is not ok 40074 1727204647.41810: done checking to see if all hosts have failed 40074 1727204647.41811: getting the remaining hosts for this loop 40074 1727204647.41812: done getting the remaining hosts for this loop 40074 1727204647.41816: getting the next task for host managed-node2 40074 1727204647.41827: done getting next task for host managed-node2 40074 1727204647.41837: ^ task is: TASK: meta (role_complete) 40074 1727204647.41841: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204647.41854: getting variables 40074 1727204647.41856: in VariableManager get_vars() 40074 1727204647.41905: Calling all_inventory to load vars for managed-node2 40074 1727204647.41908: Calling groups_inventory to load vars for managed-node2 40074 1727204647.41910: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204647.41921: Calling all_plugins_play to load vars for managed-node2 40074 1727204647.41925: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204647.41929: Calling groups_plugins_play to load vars for managed-node2 40074 1727204647.43194: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204647.44795: done with get_vars() 40074 1727204647.44819: done getting variables 40074 1727204647.44892: done queuing things up, now waiting for results queue to drain 40074 1727204647.44894: results queue empty 40074 1727204647.44895: checking for any_errors_fatal 40074 1727204647.44897: done checking for any_errors_fatal 40074 1727204647.44898: checking for max_fail_percentage 40074 1727204647.44898: done checking for max_fail_percentage 40074 1727204647.44899: checking to see if all hosts have failed and the running result is not ok 40074 1727204647.44900: done checking to see if all hosts have failed 40074 1727204647.44900: getting the remaining hosts for this loop 40074 1727204647.44901: done getting the remaining hosts for this loop 40074 1727204647.44903: getting the next task for host managed-node2 40074 
1727204647.44906: done getting next task for host managed-node2 40074 1727204647.44908: ^ task is: TASK: Delete interface1 40074 1727204647.44910: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204647.44912: getting variables 40074 1727204647.44913: in VariableManager get_vars() 40074 1727204647.44926: Calling all_inventory to load vars for managed-node2 40074 1727204647.44927: Calling groups_inventory to load vars for managed-node2 40074 1727204647.44929: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204647.44933: Calling all_plugins_play to load vars for managed-node2 40074 1727204647.44935: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204647.44937: Calling groups_plugins_play to load vars for managed-node2 40074 1727204647.46123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204647.47715: done with get_vars() 40074 1727204647.47737: done getting variables TASK [Delete interface1] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:151 Tuesday 24 September 2024 15:04:07 -0400 (0:00:00.419) 0:00:41.239 ***** 40074 1727204647.47804: entering _queue_task() for managed-node2/include_tasks 40074 1727204647.48141: worker is 1 (out of 1 available) 40074 1727204647.48156: exiting 
_queue_task() for managed-node2/include_tasks 40074 1727204647.48172: done queuing things up, now waiting for results queue to drain 40074 1727204647.48173: waiting for pending results... 40074 1727204647.48375: running TaskExecutor() for managed-node2/TASK: Delete interface1 40074 1727204647.48475: in run() - task 12b410aa-8751-9fd7-2501-0000000000b5 40074 1727204647.48488: variable 'ansible_search_path' from source: unknown 40074 1727204647.48528: calling self._execute() 40074 1727204647.48617: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204647.48627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204647.48642: variable 'omit' from source: magic vars 40074 1727204647.48994: variable 'ansible_distribution_major_version' from source: facts 40074 1727204647.49005: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204647.49012: _execute() done 40074 1727204647.49020: dumping result to json 40074 1727204647.49023: done dumping result, returning 40074 1727204647.49029: done running TaskExecutor() for managed-node2/TASK: Delete interface1 [12b410aa-8751-9fd7-2501-0000000000b5] 40074 1727204647.49034: sending task result for task 12b410aa-8751-9fd7-2501-0000000000b5 40074 1727204647.49138: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000b5 40074 1727204647.49141: WORKER PROCESS EXITING 40074 1727204647.49185: no more pending results, returning what we have 40074 1727204647.49191: in VariableManager get_vars() 40074 1727204647.49243: Calling all_inventory to load vars for managed-node2 40074 1727204647.49246: Calling groups_inventory to load vars for managed-node2 40074 1727204647.49249: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204647.49263: Calling all_plugins_play to load vars for managed-node2 40074 1727204647.49266: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204647.49270: Calling 
groups_plugins_play to load vars for managed-node2 40074 1727204647.50645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204647.52447: done with get_vars() 40074 1727204647.52470: variable 'ansible_search_path' from source: unknown 40074 1727204647.52482: we have included files to process 40074 1727204647.52482: generating all_blocks data 40074 1727204647.52484: done generating all_blocks data 40074 1727204647.52487: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 40074 1727204647.52488: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 40074 1727204647.52492: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 40074 1727204647.52679: done processing included file 40074 1727204647.52682: iterating over new_blocks loaded from include file 40074 1727204647.52683: in VariableManager get_vars() 40074 1727204647.52702: done with get_vars() 40074 1727204647.52703: filtering new block on tags 40074 1727204647.52727: done filtering new block on tags 40074 1727204647.52729: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 40074 1727204647.52733: extending task lists for all hosts with included blocks 40074 1727204647.53727: done extending task lists 40074 1727204647.53729: done processing included files 40074 1727204647.53729: results queue empty 40074 1727204647.53730: checking for any_errors_fatal 40074 1727204647.53731: done checking for any_errors_fatal 40074 1727204647.53732: checking for max_fail_percentage 40074 1727204647.53733: done checking for max_fail_percentage 40074 
1727204647.53733: checking to see if all hosts have failed and the running result is not ok 40074 1727204647.53734: done checking to see if all hosts have failed 40074 1727204647.53735: getting the remaining hosts for this loop 40074 1727204647.53735: done getting the remaining hosts for this loop 40074 1727204647.53737: getting the next task for host managed-node2 40074 1727204647.53741: done getting next task for host managed-node2 40074 1727204647.53742: ^ task is: TASK: Remove test interface if necessary 40074 1727204647.53744: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204647.53747: getting variables 40074 1727204647.53747: in VariableManager get_vars() 40074 1727204647.53758: Calling all_inventory to load vars for managed-node2 40074 1727204647.53760: Calling groups_inventory to load vars for managed-node2 40074 1727204647.53761: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204647.53766: Calling all_plugins_play to load vars for managed-node2 40074 1727204647.53769: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204647.53772: Calling groups_plugins_play to load vars for managed-node2 40074 1727204647.55853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204647.59051: done with get_vars() 40074 1727204647.59101: done getting variables 40074 1727204647.59159: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:04:07 -0400 (0:00:00.114) 0:00:41.353 ***** 40074 1727204647.59211: entering _queue_task() for managed-node2/command 40074 1727204647.59631: worker is 1 (out of 1 available) 40074 1727204647.59646: exiting _queue_task() for managed-node2/command 40074 1727204647.59662: done queuing things up, now waiting for results queue to drain 40074 1727204647.59663: waiting for pending results... 
40074 1727204647.60075: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 40074 1727204647.60166: in run() - task 12b410aa-8751-9fd7-2501-000000000777 40074 1727204647.60180: variable 'ansible_search_path' from source: unknown 40074 1727204647.60184: variable 'ansible_search_path' from source: unknown 40074 1727204647.60224: calling self._execute() 40074 1727204647.60312: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204647.60321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204647.60330: variable 'omit' from source: magic vars 40074 1727204647.60656: variable 'ansible_distribution_major_version' from source: facts 40074 1727204647.60668: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204647.60674: variable 'omit' from source: magic vars 40074 1727204647.60719: variable 'omit' from source: magic vars 40074 1727204647.60799: variable 'interface' from source: set_fact 40074 1727204647.60819: variable 'omit' from source: magic vars 40074 1727204647.60857: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204647.60888: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204647.60910: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204647.60928: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204647.60939: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204647.60970: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204647.60973: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204647.60975: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204647.61067: Set connection var ansible_pipelining to False 40074 1727204647.61071: Set connection var ansible_shell_executable to /bin/sh 40074 1727204647.61074: Set connection var ansible_shell_type to sh 40074 1727204647.61082: Set connection var ansible_connection to ssh 40074 1727204647.61085: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204647.61095: Set connection var ansible_timeout to 10 40074 1727204647.61119: variable 'ansible_shell_executable' from source: unknown 40074 1727204647.61123: variable 'ansible_connection' from source: unknown 40074 1727204647.61126: variable 'ansible_module_compression' from source: unknown 40074 1727204647.61129: variable 'ansible_shell_type' from source: unknown 40074 1727204647.61132: variable 'ansible_shell_executable' from source: unknown 40074 1727204647.61134: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204647.61139: variable 'ansible_pipelining' from source: unknown 40074 1727204647.61142: variable 'ansible_timeout' from source: unknown 40074 1727204647.61147: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204647.61265: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204647.61277: variable 'omit' from source: magic vars 40074 1727204647.61284: starting attempt loop 40074 1727204647.61287: running the handler 40074 1727204647.61308: _low_level_execute_command(): starting 40074 1727204647.61315: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204647.61943: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.61947: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204647.61951: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204647.62015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.62061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.64299: stdout chunk (state=3): >>>/root <<< 40074 1727204647.64303: stdout chunk (state=3): >>><<< 40074 1727204647.64306: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.64308: stderr chunk (state=3): >>><<< 40074 1727204647.64310: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204647.64313: _low_level_execute_command(): starting 40074 1727204647.64316: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163 `" && echo ansible-tmp-1727204647.6415117-41909-159707633181163="` echo /root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163 `" ) && sleep 0' 40074 1727204647.65462: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204647.65480: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.65504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.65607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204647.65633: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.65712: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.67775: stdout chunk (state=3): >>>ansible-tmp-1727204647.6415117-41909-159707633181163=/root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163 <<< 40074 1727204647.67935: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.68024: stderr chunk (state=3): >>><<< 40074 1727204647.68030: stdout chunk (state=3): >>><<< 40074 1727204647.68063: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204647.6415117-41909-159707633181163=/root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204647.68104: variable 'ansible_module_compression' from source: unknown 40074 1727204647.68214: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204647.68332: variable 'ansible_facts' from source: unknown 40074 1727204647.68439: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/AnsiballZ_command.py 40074 1727204647.68694: Sending initial data 40074 1727204647.68698: Sent initial data (156 bytes) 40074 1727204647.69306: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.69343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 
1727204647.69359: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204647.69374: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.69447: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.71179: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204647.71239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204647.71274: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/AnsiballZ_command.py" <<< 40074 1727204647.71311: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpljqrx6r5 /root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/AnsiballZ_command.py <<< 40074 1727204647.71345: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpljqrx6r5" to remote "/root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/AnsiballZ_command.py" <<< 40074 1727204647.72470: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.72511: stderr chunk (state=3): >>><<< 40074 1727204647.72522: stdout chunk (state=3): >>><<< 40074 1727204647.72552: done transferring module to remote 40074 1727204647.72580: _low_level_execute_command(): starting 40074 1727204647.72592: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/ /root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/AnsiballZ_command.py && sleep 0' 40074 1727204647.73264: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204647.73279: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.73294: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.73314: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204647.73343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 
40074 1727204647.73407: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.73486: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204647.73506: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204647.73527: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.73608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.75780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204647.75852: stderr chunk (state=3): >>><<< 40074 1727204647.75862: stdout chunk (state=3): >>><<< 40074 1727204647.75906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204647.75923: _low_level_execute_command(): starting 40074 1727204647.75935: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/AnsiballZ_command.py && sleep 0' 40074 1727204647.76606: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204647.76621: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.76743: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.76765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204647.76788: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 40074 1727204647.76809: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204647.76892: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204647.96259: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-24 15:04:07.946698", "end": "2024-09-24 15:04:07.956172", "delta": "0:00:00.009474", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204647.98897: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204647.98901: stderr chunk (state=3): >>><<< 40074 1727204647.98904: stdout chunk (state=3): >>><<< 40074 1727204647.98906: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest1"], "start": "2024-09-24 15:04:07.946698", "end": "2024-09-24 15:04:07.956172", "delta": "0:00:00.009474", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204647.98960: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204647.98970: _low_level_execute_command(): starting 40074 1727204647.98976: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204647.6415117-41909-159707633181163/ > /dev/null 2>&1 && sleep 0' 40074 1727204647.99653: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204647.99665: stderr 
chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.99677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.99696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204647.99710: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204647.99718: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204647.99732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.99770: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204647.99774: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204647.99776: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 40074 1727204647.99779: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204647.99784: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204647.99880: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204647.99884: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204647.99886: stderr chunk (state=3): >>>debug2: match found <<< 40074 1727204647.99888: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204647.99908: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204647.99934: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204647.99944: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204648.00024: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204648.06395: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204648.06399: stdout chunk (state=3): >>><<< 40074 1727204648.06402: stderr chunk (state=3): >>><<< 40074 1727204648.06405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204648.06411: handler run complete 40074 1727204648.06441: Evaluated conditional (False): False 40074 1727204648.06452: attempt loop complete, returning result 40074 1727204648.06456: _execute() done 40074 1727204648.06460: dumping result to json 40074 1727204648.06467: done dumping result, returning 40074 1727204648.06477: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [12b410aa-8751-9fd7-2501-000000000777] 40074 1727204648.06486: sending task result for 
task 12b410aa-8751-9fd7-2501-000000000777 40074 1727204648.06604: done sending task result for task 12b410aa-8751-9fd7-2501-000000000777 40074 1727204648.06608: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest1" ], "delta": "0:00:00.009474", "end": "2024-09-24 15:04:07.956172", "rc": 0, "start": "2024-09-24 15:04:07.946698" } 40074 1727204648.06831: no more pending results, returning what we have 40074 1727204648.06834: results queue empty 40074 1727204648.06836: checking for any_errors_fatal 40074 1727204648.06837: done checking for any_errors_fatal 40074 1727204648.06838: checking for max_fail_percentage 40074 1727204648.06839: done checking for max_fail_percentage 40074 1727204648.06840: checking to see if all hosts have failed and the running result is not ok 40074 1727204648.06842: done checking to see if all hosts have failed 40074 1727204648.06842: getting the remaining hosts for this loop 40074 1727204648.06844: done getting the remaining hosts for this loop 40074 1727204648.06848: getting the next task for host managed-node2 40074 1727204648.06857: done getting next task for host managed-node2 40074 1727204648.06860: ^ task is: TASK: Assert interface1 is absent 40074 1727204648.06864: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204648.06871: getting variables 40074 1727204648.06873: in VariableManager get_vars() 40074 1727204648.06939: Calling all_inventory to load vars for managed-node2 40074 1727204648.06943: Calling groups_inventory to load vars for managed-node2 40074 1727204648.06946: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204648.06960: Calling all_plugins_play to load vars for managed-node2 40074 1727204648.06964: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204648.06968: Calling groups_plugins_play to load vars for managed-node2 40074 1727204648.09227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204648.12124: done with get_vars() 40074 1727204648.12170: done getting variables TASK [Assert interface1 is absent] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:153 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.530) 0:00:41.884 ***** 40074 1727204648.12294: entering _queue_task() for managed-node2/include_tasks 40074 1727204648.12683: worker is 1 (out of 1 available) 40074 1727204648.12703: exiting _queue_task() for managed-node2/include_tasks 40074 1727204648.12718: done queuing things up, now waiting for results queue to drain 40074 1727204648.12720: waiting for pending results... 
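The task result above shows the fields the command module reports for `ip link del ethtest1`: the argv list, `rc`, `start`/`end` timestamps, and their `delta`. As a minimal sketch (not Ansible's actual implementation — the helper name `run_logged` is illustrative), capturing the same fields in Python looks like:

```python
import datetime
import subprocess

def run_logged(argv):
    # Illustrative only: mirrors the result fields seen in the log
    # (cmd, rc, stdout, stderr, start, end, delta), not Ansible's code.
    start = datetime.datetime.now()
    proc = subprocess.run(argv, capture_output=True, text=True)
    end = datetime.datetime.now()
    return {
        "cmd": argv,
        "rc": proc.returncode,
        "stdout": proc.stdout.strip(),
        "stderr": proc.stderr.strip(),
        "start": str(start),
        "end": str(end),
        # Rendered the same way as the log's "delta": "0:00:00.009474"
        "delta": str(end - start),
    }
```

A zero `rc` with empty `stdout`/`stderr`, as in the result above, is the expected outcome of a successful `ip link del`.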
40074 1727204648.13076: running TaskExecutor() for managed-node2/TASK: Assert interface1 is absent 40074 1727204648.13215: in run() - task 12b410aa-8751-9fd7-2501-0000000000b6 40074 1727204648.13281: variable 'ansible_search_path' from source: unknown 40074 1727204648.13296: calling self._execute() 40074 1727204648.13420: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204648.13435: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204648.13452: variable 'omit' from source: magic vars 40074 1727204648.13934: variable 'ansible_distribution_major_version' from source: facts 40074 1727204648.13938: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204648.13946: _execute() done 40074 1727204648.13957: dumping result to json 40074 1727204648.13965: done dumping result, returning 40074 1727204648.13975: done running TaskExecutor() for managed-node2/TASK: Assert interface1 is absent [12b410aa-8751-9fd7-2501-0000000000b6] 40074 1727204648.14097: sending task result for task 12b410aa-8751-9fd7-2501-0000000000b6 40074 1727204648.14173: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000b6 40074 1727204648.14176: WORKER PROCESS EXITING 40074 1727204648.14230: no more pending results, returning what we have 40074 1727204648.14235: in VariableManager get_vars() 40074 1727204648.14286: Calling all_inventory to load vars for managed-node2 40074 1727204648.14292: Calling groups_inventory to load vars for managed-node2 40074 1727204648.14294: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204648.14310: Calling all_plugins_play to load vars for managed-node2 40074 1727204648.14314: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204648.14318: Calling groups_plugins_play to load vars for managed-node2 40074 1727204648.16756: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 40074 1727204648.19853: done with get_vars() 40074 1727204648.19887: variable 'ansible_search_path' from source: unknown 40074 1727204648.19906: we have included files to process 40074 1727204648.19907: generating all_blocks data 40074 1727204648.19909: done generating all_blocks data 40074 1727204648.19916: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 40074 1727204648.19919: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 40074 1727204648.19922: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 40074 1727204648.20130: in VariableManager get_vars() 40074 1727204648.20166: done with get_vars() 40074 1727204648.20309: done processing included file 40074 1727204648.20311: iterating over new_blocks loaded from include file 40074 1727204648.20313: in VariableManager get_vars() 40074 1727204648.20337: done with get_vars() 40074 1727204648.20339: filtering new block on tags 40074 1727204648.20385: done filtering new block on tags 40074 1727204648.20389: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 40074 1727204648.20396: extending task lists for all hosts with included blocks 40074 1727204648.22312: done extending task lists 40074 1727204648.22313: done processing included files 40074 1727204648.22314: results queue empty 40074 1727204648.22315: checking for any_errors_fatal 40074 1727204648.22325: done checking for any_errors_fatal 40074 1727204648.22326: checking for max_fail_percentage 40074 1727204648.22327: done checking for max_fail_percentage 40074 1727204648.22328: checking to see if all hosts have failed 
and the running result is not ok 40074 1727204648.22330: done checking to see if all hosts have failed 40074 1727204648.22331: getting the remaining hosts for this loop 40074 1727204648.22332: done getting the remaining hosts for this loop 40074 1727204648.22335: getting the next task for host managed-node2 40074 1727204648.22341: done getting next task for host managed-node2 40074 1727204648.22343: ^ task is: TASK: Include the task 'get_interface_stat.yml' 40074 1727204648.22353: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204648.22357: getting variables 40074 1727204648.22358: in VariableManager get_vars() 40074 1727204648.22377: Calling all_inventory to load vars for managed-node2 40074 1727204648.22380: Calling groups_inventory to load vars for managed-node2 40074 1727204648.22383: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204648.22391: Calling all_plugins_play to load vars for managed-node2 40074 1727204648.22394: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204648.22398: Calling groups_plugins_play to load vars for managed-node2 40074 1727204648.24634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204648.27930: done with get_vars() 40074 1727204648.27962: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.157) 0:00:42.042 ***** 40074 1727204648.28058: entering _queue_task() for managed-node2/include_tasks 40074 1727204648.28643: worker is 1 (out of 1 available) 40074 1727204648.28657: exiting _queue_task() for managed-node2/include_tasks 40074 1727204648.28671: done queuing things up, now waiting for results queue to drain 40074 1727204648.28673: waiting for pending results... 
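The repeated `auto-mux: Trying existing master` and `mux_client_request_session: master session id: 2` lines show every task reusing one multiplexed SSH connection rather than opening a fresh one. A hypothetical `~/.ssh/config` stanza that enables this kind of connection reuse (the host, socket path, and timeout here are illustrative values, not taken from this run):

```
Host 10.31.9.159
    ControlMaster auto
    ControlPath ~/.ssh/cm-%r@%h:%p
    ControlPersist 60s
```

With a persisted master in place, each `_low_level_execute_command()` call only pays for a lightweight mux session, which is why the per-task SSH setup in this log is so fast.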
40074 1727204648.28897: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 40074 1727204648.29096: in run() - task 12b410aa-8751-9fd7-2501-000000000816 40074 1727204648.29101: variable 'ansible_search_path' from source: unknown 40074 1727204648.29105: variable 'ansible_search_path' from source: unknown 40074 1727204648.29149: calling self._execute() 40074 1727204648.29279: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204648.29296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204648.29320: variable 'omit' from source: magic vars 40074 1727204648.29810: variable 'ansible_distribution_major_version' from source: facts 40074 1727204648.29833: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204648.29847: _execute() done 40074 1727204648.29858: dumping result to json 40074 1727204648.29868: done dumping result, returning 40074 1727204648.29891: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-9fd7-2501-000000000816] 40074 1727204648.29904: sending task result for task 12b410aa-8751-9fd7-2501-000000000816 40074 1727204648.30167: no more pending results, returning what we have 40074 1727204648.30173: in VariableManager get_vars() 40074 1727204648.30239: Calling all_inventory to load vars for managed-node2 40074 1727204648.30243: Calling groups_inventory to load vars for managed-node2 40074 1727204648.30246: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204648.30254: done sending task result for task 12b410aa-8751-9fd7-2501-000000000816 40074 1727204648.30257: WORKER PROCESS EXITING 40074 1727204648.30273: Calling all_plugins_play to load vars for managed-node2 40074 1727204648.30277: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204648.30281: Calling groups_plugins_play to load vars for managed-node2 40074 
1727204648.37937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204648.41135: done with get_vars() 40074 1727204648.41171: variable 'ansible_search_path' from source: unknown 40074 1727204648.41173: variable 'ansible_search_path' from source: unknown 40074 1727204648.41221: we have included files to process 40074 1727204648.41223: generating all_blocks data 40074 1727204648.41225: done generating all_blocks data 40074 1727204648.41226: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204648.41227: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204648.41230: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204648.41452: done processing included file 40074 1727204648.41455: iterating over new_blocks loaded from include file 40074 1727204648.41457: in VariableManager get_vars() 40074 1727204648.41481: done with get_vars() 40074 1727204648.41483: filtering new block on tags 40074 1727204648.41524: done filtering new block on tags 40074 1727204648.41527: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 40074 1727204648.41533: extending task lists for all hosts with included blocks 40074 1727204648.41706: done extending task lists 40074 1727204648.41708: done processing included files 40074 1727204648.41709: results queue empty 40074 1727204648.41710: checking for any_errors_fatal 40074 1727204648.41713: done checking for any_errors_fatal 40074 1727204648.41714: checking for max_fail_percentage 40074 1727204648.41715: done checking for 
max_fail_percentage 40074 1727204648.41719: checking to see if all hosts have failed and the running result is not ok 40074 1727204648.41720: done checking to see if all hosts have failed 40074 1727204648.41721: getting the remaining hosts for this loop 40074 1727204648.41722: done getting the remaining hosts for this loop 40074 1727204648.41725: getting the next task for host managed-node2 40074 1727204648.41735: done getting next task for host managed-node2 40074 1727204648.41738: ^ task is: TASK: Get stat for interface {{ interface }} 40074 1727204648.41742: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204648.41745: getting variables 40074 1727204648.41746: in VariableManager get_vars() 40074 1727204648.41762: Calling all_inventory to load vars for managed-node2 40074 1727204648.41765: Calling groups_inventory to load vars for managed-node2 40074 1727204648.41768: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204648.41775: Calling all_plugins_play to load vars for managed-node2 40074 1727204648.41778: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204648.41781: Calling groups_plugins_play to load vars for managed-node2 40074 1727204648.43916: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204648.47050: done with get_vars() 40074 1727204648.47091: done getting variables 40074 1727204648.47276: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest1] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.192) 0:00:42.234 ***** 40074 1727204648.47320: entering _queue_task() for managed-node2/stat 40074 1727204648.47897: worker is 1 (out of 1 available) 40074 1727204648.47910: exiting _queue_task() for managed-node2/stat 40074 1727204648.47926: done queuing things up, now waiting for results queue to drain 40074 1727204648.47927: waiting for pending results... 
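The included `get_interface_stat.yml` that runs next uses the stat module against the interface's device node. On Linux the same presence check can be sketched with a plain path test (the helper name is illustrative, and the `/sys/class/net` location is an assumption about how the included task finds the device):

```python
import os

def interface_present(name):
    # Assumption: the kernel exposes each network device under
    # /sys/class/net/<name>, so a stat/exists check on that path
    # answers "does the interface exist?".
    return os.path.exists(os.path.join("/sys/class/net", name))

# After the earlier `ip link del ethtest1`, a check like
# interface_present("ethtest1") should report False.
```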
40074 1727204648.48170: running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest1 40074 1727204648.48287: in run() - task 12b410aa-8751-9fd7-2501-0000000008bc 40074 1727204648.48313: variable 'ansible_search_path' from source: unknown 40074 1727204648.48326: variable 'ansible_search_path' from source: unknown 40074 1727204648.48379: calling self._execute() 40074 1727204648.48507: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204648.48525: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204648.48543: variable 'omit' from source: magic vars 40074 1727204648.49043: variable 'ansible_distribution_major_version' from source: facts 40074 1727204648.49063: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204648.49075: variable 'omit' from source: magic vars 40074 1727204648.49163: variable 'omit' from source: magic vars 40074 1727204648.49353: variable 'interface' from source: set_fact 40074 1727204648.49361: variable 'omit' from source: magic vars 40074 1727204648.49384: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204648.49437: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204648.49474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204648.49504: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204648.49529: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204648.49580: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204648.49593: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204648.49680: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204648.49759: Set connection var ansible_pipelining to False 40074 1727204648.49775: Set connection var ansible_shell_executable to /bin/sh 40074 1727204648.49788: Set connection var ansible_shell_type to sh 40074 1727204648.49804: Set connection var ansible_connection to ssh 40074 1727204648.49820: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204648.49835: Set connection var ansible_timeout to 10 40074 1727204648.49873: variable 'ansible_shell_executable' from source: unknown 40074 1727204648.49883: variable 'ansible_connection' from source: unknown 40074 1727204648.49896: variable 'ansible_module_compression' from source: unknown 40074 1727204648.49910: variable 'ansible_shell_type' from source: unknown 40074 1727204648.49923: variable 'ansible_shell_executable' from source: unknown 40074 1727204648.49932: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204648.49943: variable 'ansible_pipelining' from source: unknown 40074 1727204648.49953: variable 'ansible_timeout' from source: unknown 40074 1727204648.49995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204648.50255: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204648.50277: variable 'omit' from source: magic vars 40074 1727204648.50293: starting attempt loop 40074 1727204648.50335: running the handler 40074 1727204648.50343: _low_level_execute_command(): starting 40074 1727204648.50346: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204648.51224: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204648.51298: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204648.51350: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204648.51353: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204648.51443: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204648.53264: stdout chunk (state=3): >>>/root <<< 40074 1727204648.53484: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204648.53488: stdout chunk (state=3): >>><<< 40074 1727204648.53494: stderr chunk (state=3): >>><<< 40074 1727204648.53515: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204648.53540: _low_level_execute_command(): starting 40074 1727204648.53636: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586 `" && echo ansible-tmp-1727204648.535255-41934-76853882269586="` echo /root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586 `" ) && sleep 0' 40074 1727204648.54258: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204648.54274: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204648.54292: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204648.54311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204648.54349: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204648.54362: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204648.54472: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204648.54501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204648.54513: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204648.54538: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204648.54613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204648.56706: stdout chunk (state=3): >>>ansible-tmp-1727204648.535255-41934-76853882269586=/root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586 <<< 40074 1727204648.56922: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204648.56925: stdout chunk (state=3): >>><<< 40074 1727204648.56927: stderr chunk (state=3): >>><<< 40074 1727204648.56944: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204648.535255-41934-76853882269586=/root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
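The `_low_level_execute_command()` call above creates the per-task working directory on the target. Its shell one-liner boils down to the following Python sketch; the timestamp/PID/random-suffix naming is Ansible-internal, so the `"demo"` suffix here is a stand-in, not the real generator:

```python
import os
import time

def make_ansible_tmpdir(base: str = "~/.ansible/tmp") -> str:
    """Approximate the log's `( umask 77 && mkdir -p ... && mkdir ansible-tmp-... )`."""
    base = os.path.expanduser(base)
    old_umask = os.umask(0o77)  # mirrors `umask 77`: the new dir ends up mode 0700
    try:
        os.makedirs(base, exist_ok=True)  # mkdir -p ~/.ansible/tmp
        path = os.path.join(base, f"ansible-tmp-{time.time()}-{os.getpid()}-demo")
        os.mkdir(path)
        return path
    finally:
        os.umask(old_umask)
```

The `umask 77` matters: module payloads land in this directory, so it must not be readable by other users on the managed node.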
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204648.57099: variable 'ansible_module_compression' from source: unknown 40074 1727204648.57102: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 40074 1727204648.57125: variable 'ansible_facts' from source: unknown 40074 1727204648.57225: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/AnsiballZ_stat.py 40074 1727204648.57451: Sending initial data 40074 1727204648.57455: Sent initial data (151 bytes) 40074 1727204648.58092: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204648.58102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204648.58173: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204648.59925: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204648.59966: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204648.60005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpfn7cy5lh /root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/AnsiballZ_stat.py <<< 40074 1727204648.60009: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/AnsiballZ_stat.py" <<< 40074 1727204648.60073: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpfn7cy5lh" to remote "/root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/AnsiballZ_stat.py" <<< 40074 1727204648.61509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204648.61542: stderr chunk (state=3): >>><<< 40074 1727204648.61558: stdout chunk (state=3): >>><<< 40074 1727204648.61586: done transferring module to remote 40074 1727204648.61617: _low_level_execute_command(): starting 40074 1727204648.61620: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/ /root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/AnsiballZ_stat.py && sleep 0' 40074 1727204648.62306: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204648.62385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204648.62412: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204648.62456: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204648.62486: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204648.64551: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204648.64554: stdout chunk (state=3): >>><<< 40074 1727204648.64556: stderr chunk (state=3): >>><<< 40074 1727204648.64595: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing 
master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204648.64598: _low_level_execute_command(): starting 40074 1727204648.64601: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/AnsiballZ_stat.py && sleep 0' 40074 1727204648.65235: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204648.65256: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204648.65371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 40074 1727204648.65403: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204648.65422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204648.65508: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204648.83105: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": 
{"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 40074 1727204648.84705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204648.84709: stdout chunk (state=3): >>><<< 40074 1727204648.84717: stderr chunk (state=3): >>><<< 40074 1727204648.84753: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
40074 1727204648.84791: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204648.84806: _low_level_execute_command(): starting 40074 1727204648.84810: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204648.535255-41934-76853882269586/ > /dev/null 2>&1 && sleep 0' 40074 1727204648.85510: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204648.85514: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204648.85533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204648.85575: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204648.85578: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204648.85675: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
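For this test the `stat` module invocation above (with checksum, mime, and attribute collection all disabled) reduces to a single existence check against sysfs. A simplified sketch — the real module returns a much richer `stat` dict than this:

```python
import os

def interface_exists(name: str) -> bool:
    """True if the kernel exposes the network interface under /sys/class/net."""
    return os.path.exists(os.path.join("/sys/class/net", name))
```

Here `interface_exists("ethtest1")` would be False on the managed node, matching the module result `{"stat": {"exists": false}}`.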
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204648.85743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204648.85746: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204648.85753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204648.85800: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204648.87814: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204648.87916: stderr chunk (state=3): >>><<< 40074 1727204648.87937: stdout chunk (state=3): >>><<< 40074 1727204648.88096: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204648.88106: handler run complete 40074 1727204648.88108: attempt loop complete, returning result 40074 1727204648.88111: _execute() done 40074 1727204648.88113: dumping result to json 40074 1727204648.88115: done dumping result, returning 40074 1727204648.88121: done running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest1 [12b410aa-8751-9fd7-2501-0000000008bc] 40074 1727204648.88123: sending task result for task 12b410aa-8751-9fd7-2501-0000000008bc ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 40074 1727204648.88288: no more pending results, returning what we have 40074 1727204648.88295: results queue empty 40074 1727204648.88296: checking for any_errors_fatal 40074 1727204648.88297: done checking for any_errors_fatal 40074 1727204648.88298: checking for max_fail_percentage 40074 1727204648.88300: done checking for max_fail_percentage 40074 1727204648.88301: checking to see if all hosts have failed and the running result is not ok 40074 1727204648.88420: done checking to see if all hosts have failed 40074 1727204648.88422: getting the remaining hosts for this loop 40074 1727204648.88424: done getting the remaining hosts for this loop 40074 1727204648.88429: getting the next task for host managed-node2 40074 1727204648.88440: done getting next task for host managed-node2 40074 1727204648.88443: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 40074 1727204648.88450: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204648.88455: getting variables 40074 1727204648.88457: in VariableManager get_vars() 40074 1727204648.88514: Calling all_inventory to load vars for managed-node2 40074 1727204648.88594: Calling groups_inventory to load vars for managed-node2 40074 1727204648.88599: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204648.88621: Calling all_plugins_play to load vars for managed-node2 40074 1727204648.88626: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204648.88636: Calling groups_plugins_play to load vars for managed-node2 40074 1727204648.89255: done sending task result for task 12b410aa-8751-9fd7-2501-0000000008bc 40074 1727204648.89263: WORKER PROCESS EXITING 40074 1727204648.91308: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204648.93244: done with get_vars() 40074 1727204648.93267: done getting variables 40074 1727204648.93324: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204648.93436: variable 'interface' from source: set_fact TASK [Assert that the interface is 
absent - 'ethtest1'] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:04:08 -0400 (0:00:00.461) 0:00:42.696 ***** 40074 1727204648.93469: entering _queue_task() for managed-node2/assert 40074 1727204648.93855: worker is 1 (out of 1 available) 40074 1727204648.93869: exiting _queue_task() for managed-node2/assert 40074 1727204648.93884: done queuing things up, now waiting for results queue to drain 40074 1727204648.93886: waiting for pending results... 40074 1727204648.94225: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'ethtest1' 40074 1727204648.94379: in run() - task 12b410aa-8751-9fd7-2501-000000000817 40074 1727204648.94405: variable 'ansible_search_path' from source: unknown 40074 1727204648.94415: variable 'ansible_search_path' from source: unknown 40074 1727204648.94496: calling self._execute() 40074 1727204648.94586: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204648.94604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204648.94635: variable 'omit' from source: magic vars 40074 1727204648.95000: variable 'ansible_distribution_major_version' from source: facts 40074 1727204648.95009: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204648.95016: variable 'omit' from source: magic vars 40074 1727204648.95057: variable 'omit' from source: magic vars 40074 1727204648.95140: variable 'interface' from source: set_fact 40074 1727204648.95158: variable 'omit' from source: magic vars 40074 1727204648.95194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204648.95230: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204648.95248: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204648.95267: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204648.95277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204648.95309: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204648.95312: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204648.95316: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204648.95405: Set connection var ansible_pipelining to False 40074 1727204648.95412: Set connection var ansible_shell_executable to /bin/sh 40074 1727204648.95415: Set connection var ansible_shell_type to sh 40074 1727204648.95417: Set connection var ansible_connection to ssh 40074 1727204648.95428: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204648.95437: Set connection var ansible_timeout to 10 40074 1727204648.95458: variable 'ansible_shell_executable' from source: unknown 40074 1727204648.95462: variable 'ansible_connection' from source: unknown 40074 1727204648.95465: variable 'ansible_module_compression' from source: unknown 40074 1727204648.95468: variable 'ansible_shell_type' from source: unknown 40074 1727204648.95472: variable 'ansible_shell_executable' from source: unknown 40074 1727204648.95475: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204648.95484: variable 'ansible_pipelining' from source: unknown 40074 1727204648.95486: variable 'ansible_timeout' from source: unknown 40074 1727204648.95489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204648.95611: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204648.95626: variable 'omit' from source: magic vars 40074 1727204648.95633: starting attempt loop 40074 1727204648.95636: running the handler 40074 1727204648.95761: variable 'interface_stat' from source: set_fact 40074 1727204648.95770: Evaluated conditional (not interface_stat.stat.exists): True 40074 1727204648.95778: handler run complete 40074 1727204648.95794: attempt loop complete, returning result 40074 1727204648.95798: _execute() done 40074 1727204648.95800: dumping result to json 40074 1727204648.95805: done dumping result, returning 40074 1727204648.95813: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'ethtest1' [12b410aa-8751-9fd7-2501-000000000817] 40074 1727204648.95819: sending task result for task 12b410aa-8751-9fd7-2501-000000000817 40074 1727204648.95912: done sending task result for task 12b410aa-8751-9fd7-2501-000000000817 40074 1727204648.95915: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 40074 1727204648.95971: no more pending results, returning what we have 40074 1727204648.95975: results queue empty 40074 1727204648.95976: checking for any_errors_fatal 40074 1727204648.95993: done checking for any_errors_fatal 40074 1727204648.95994: checking for max_fail_percentage 40074 1727204648.95996: done checking for max_fail_percentage 40074 1727204648.95997: checking to see if all hosts have failed and the running result is not ok 40074 1727204648.95999: done checking to see if all hosts have failed 40074 1727204648.96000: getting the remaining hosts for this loop 40074 1727204648.96001: done getting the remaining hosts for this loop 40074 1727204648.96006: getting the next task for host managed-node2 40074 1727204648.96015: done getting next task for host managed-node2 
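The assert action above evaluated the conditional `not interface_stat.stat.exists` against the registered result of the preceding stat task. In miniature (a sketch of the check, not Ansible's Jinja2 evaluation machinery):

```python
def assert_absent(interface_stat: dict) -> str:
    """Fail if the registered stat result says the interface still exists."""
    if interface_stat["stat"]["exists"]:
        raise AssertionError("interface is still present")
    return "All assertions passed"
```

Since the stat task returned `exists: false`, the conditional evaluated True and the task reported `MSG: All assertions passed`.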
40074 1727204648.96018: ^ task is: TASK: Set interface0 40074 1727204648.96022: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204648.96031: getting variables 40074 1727204648.96033: in VariableManager get_vars() 40074 1727204648.96074: Calling all_inventory to load vars for managed-node2 40074 1727204648.96077: Calling groups_inventory to load vars for managed-node2 40074 1727204648.96080: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204648.96092: Calling all_plugins_play to load vars for managed-node2 40074 1727204648.96096: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204648.96099: Calling groups_plugins_play to load vars for managed-node2 40074 1727204648.97792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204649.00795: done with get_vars() 40074 1727204649.00831: done getting variables 40074 1727204649.00902: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set interface0] ********************************************************** task path: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:155 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.074) 0:00:42.771 ***** 40074 1727204649.00937: entering _queue_task() for managed-node2/set_fact 40074 1727204649.01278: worker is 1 (out of 1 available) 40074 1727204649.01297: exiting _queue_task() for managed-node2/set_fact 40074 1727204649.01313: done queuing things up, now waiting for results queue to drain 40074 1727204649.01315: waiting for pending results... 40074 1727204649.01713: running TaskExecutor() for managed-node2/TASK: Set interface0 40074 1727204649.01718: in run() - task 12b410aa-8751-9fd7-2501-0000000000b7 40074 1727204649.01733: variable 'ansible_search_path' from source: unknown 40074 1727204649.01774: calling self._execute() 40074 1727204649.01943: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.01947: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.01951: variable 'omit' from source: magic vars 40074 1727204649.02413: variable 'ansible_distribution_major_version' from source: facts 40074 1727204649.02430: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204649.02438: variable 'omit' from source: magic vars 40074 1727204649.02495: variable 'omit' from source: magic vars 40074 1727204649.02530: variable 'interface0' from source: play vars 40074 1727204649.02708: variable 'interface0' from source: play vars 40074 1727204649.02712: variable 'omit' from source: magic vars 40074 1727204649.02714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204649.02752: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204649.02775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204649.02799: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204649.02823: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204649.02858: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204649.02862: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.02865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.03014: Set connection var ansible_pipelining to False 40074 1727204649.03025: Set connection var ansible_shell_executable to /bin/sh 40074 1727204649.03195: Set connection var ansible_shell_type to sh 40074 1727204649.03198: Set connection var ansible_connection to ssh 40074 1727204649.03201: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204649.03203: Set connection var ansible_timeout to 10 40074 1727204649.03206: variable 'ansible_shell_executable' from source: unknown 40074 1727204649.03208: variable 'ansible_connection' from source: unknown 40074 1727204649.03210: variable 'ansible_module_compression' from source: unknown 40074 1727204649.03212: variable 'ansible_shell_type' from source: unknown 40074 1727204649.03215: variable 'ansible_shell_executable' from source: unknown 40074 1727204649.03217: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.03219: variable 'ansible_pipelining' from source: unknown 40074 1727204649.03221: variable 'ansible_timeout' from source: unknown 40074 1727204649.03223: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.03313: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204649.03331: variable 'omit' from source: magic vars 40074 1727204649.03340: starting attempt loop 40074 1727204649.03343: running the handler 40074 1727204649.03356: handler run complete 40074 1727204649.03377: attempt loop complete, returning result 40074 1727204649.03380: _execute() done 40074 1727204649.03383: dumping result to json 40074 1727204649.03388: done dumping result, returning 40074 1727204649.03401: done running TaskExecutor() for managed-node2/TASK: Set interface0 [12b410aa-8751-9fd7-2501-0000000000b7] 40074 1727204649.03404: sending task result for task 12b410aa-8751-9fd7-2501-0000000000b7 ok: [managed-node2] => { "ansible_facts": { "interface": "ethtest0" }, "changed": false } 40074 1727204649.03647: no more pending results, returning what we have 40074 1727204649.03650: results queue empty 40074 1727204649.03651: checking for any_errors_fatal 40074 1727204649.03657: done checking for any_errors_fatal 40074 1727204649.03658: checking for max_fail_percentage 40074 1727204649.03660: done checking for max_fail_percentage 40074 1727204649.03660: checking to see if all hosts have failed and the running result is not ok 40074 1727204649.03662: done checking to see if all hosts have failed 40074 1727204649.03662: getting the remaining hosts for this loop 40074 1727204649.03664: done getting the remaining hosts for this loop 40074 1727204649.03667: getting the next task for host managed-node2 40074 1727204649.03674: done getting next task for host managed-node2 40074 1727204649.03677: ^ task is: TASK: Delete interface0 40074 1727204649.03680: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204649.03684: getting variables 40074 1727204649.03686: in VariableManager get_vars() 40074 1727204649.03726: Calling all_inventory to load vars for managed-node2 40074 1727204649.03729: Calling groups_inventory to load vars for managed-node2 40074 1727204649.03732: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204649.03742: Calling all_plugins_play to load vars for managed-node2 40074 1727204649.03745: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204649.03748: Calling groups_plugins_play to load vars for managed-node2 40074 1727204649.04307: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000b7 40074 1727204649.04310: WORKER PROCESS EXITING 40074 1727204649.06116: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204649.09102: done with get_vars() 40074 1727204649.09139: done getting variables TASK [Delete interface0] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:158 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.083) 0:00:42.854 ***** 40074 1727204649.09254: entering _queue_task() for managed-node2/include_tasks 40074 1727204649.09612: worker is 1 (out of 1 available) 40074 1727204649.09628: exiting _queue_task() for managed-node2/include_tasks 40074 1727204649.09641: done queuing things up, now waiting for results queue to drain 40074 1727204649.09643: waiting for pending results... 
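The two task results above ("Set interface0" resolving the play var `interface0` into the fact `interface: ethtest0`, then "Delete interface0" queued as an include) correspond to consecutive tasks at `tests_route_device.yml:155` and `:158`. A minimal sketch of what those tasks plausibly look like, reconstructed from the log rather than from the actual file:

```yaml
# Hypothetical reconstruction of tests_route_device.yml:155 and :158.
# Task names, the interface0 play var, and the include path are taken
# from the log; the real file may differ in detail.
- name: Set interface0
  set_fact:
    interface: "{{ interface0 }}"  # the log shows this resolving to "ethtest0"

- name: Delete interface0
  include_tasks: tasks/delete_interface.yml
```

The `set_fact` action runs entirely on the controller (note the log's "running the handler" / "handler run complete" with no SSH traffic), whereas the include merely extends the task list for the host.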
40074 1727204649.09966: running TaskExecutor() for managed-node2/TASK: Delete interface0 40074 1727204649.10106: in run() - task 12b410aa-8751-9fd7-2501-0000000000b8 40074 1727204649.10135: variable 'ansible_search_path' from source: unknown 40074 1727204649.10180: calling self._execute() 40074 1727204649.10305: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.10329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.10350: variable 'omit' from source: magic vars 40074 1727204649.10833: variable 'ansible_distribution_major_version' from source: facts 40074 1727204649.10851: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204649.10863: _execute() done 40074 1727204649.10872: dumping result to json 40074 1727204649.10887: done dumping result, returning 40074 1727204649.10900: done running TaskExecutor() for managed-node2/TASK: Delete interface0 [12b410aa-8751-9fd7-2501-0000000000b8] 40074 1727204649.10909: sending task result for task 12b410aa-8751-9fd7-2501-0000000000b8 40074 1727204649.11050: no more pending results, returning what we have 40074 1727204649.11056: in VariableManager get_vars() 40074 1727204649.11116: Calling all_inventory to load vars for managed-node2 40074 1727204649.11119: Calling groups_inventory to load vars for managed-node2 40074 1727204649.11123: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204649.11140: Calling all_plugins_play to load vars for managed-node2 40074 1727204649.11144: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204649.11149: Calling groups_plugins_play to load vars for managed-node2 40074 1727204649.12206: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000b8 40074 1727204649.12211: WORKER PROCESS EXITING 40074 1727204649.13657: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 
1727204649.16774: done with get_vars() 40074 1727204649.16808: variable 'ansible_search_path' from source: unknown 40074 1727204649.16826: we have included files to process 40074 1727204649.16827: generating all_blocks data 40074 1727204649.16829: done generating all_blocks data 40074 1727204649.16836: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 40074 1727204649.16837: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 40074 1727204649.16840: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 40074 1727204649.17060: done processing included file 40074 1727204649.17063: iterating over new_blocks loaded from include file 40074 1727204649.17064: in VariableManager get_vars() 40074 1727204649.17092: done with get_vars() 40074 1727204649.17095: filtering new block on tags 40074 1727204649.17128: done filtering new block on tags 40074 1727204649.17131: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 40074 1727204649.17137: extending task lists for all hosts with included blocks 40074 1727204649.19084: done extending task lists 40074 1727204649.19086: done processing included files 40074 1727204649.19087: results queue empty 40074 1727204649.19088: checking for any_errors_fatal 40074 1727204649.19095: done checking for any_errors_fatal 40074 1727204649.19096: checking for max_fail_percentage 40074 1727204649.19097: done checking for max_fail_percentage 40074 1727204649.19099: checking to see if all hosts have failed and the running result is not ok 40074 1727204649.19100: done checking to see if all hosts have failed 40074 1727204649.19101: getting 
the remaining hosts for this loop 40074 1727204649.19102: done getting the remaining hosts for this loop 40074 1727204649.19105: getting the next task for host managed-node2 40074 1727204649.19111: done getting next task for host managed-node2 40074 1727204649.19113: ^ task is: TASK: Remove test interface if necessary 40074 1727204649.19117: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204649.19120: getting variables 40074 1727204649.19122: in VariableManager get_vars() 40074 1727204649.19141: Calling all_inventory to load vars for managed-node2 40074 1727204649.19145: Calling groups_inventory to load vars for managed-node2 40074 1727204649.19148: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204649.19156: Calling all_plugins_play to load vars for managed-node2 40074 1727204649.19159: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204649.19163: Calling groups_plugins_play to load vars for managed-node2 40074 1727204649.21253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204649.24255: done with get_vars() 40074 1727204649.24301: done getting variables 40074 1727204649.24363: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.151) 0:00:43.005 ***** 40074 1727204649.24404: entering _queue_task() for managed-node2/command 40074 1727204649.24794: worker is 1 (out of 1 available) 40074 1727204649.24809: exiting _queue_task() for managed-node2/command 40074 1727204649.24824: done queuing things up, now waiting for results queue to drain 40074 1727204649.24826: waiting for pending results... 
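The included file contributes the "Remove test interface if necessary" task at `delete_interface.yml:3`, which the log shows dispatching the `command` action plugin. A hedged sketch of that task, inferred from the module invocation visible later in the log (`ip link del ethtest0`); the real file may guard against the interface already being absent:

```yaml
# Hypothetical sketch of tasks/delete_interface.yml:3; the command line
# is taken from the module_args logged below ("ip link del ethtest0",
# with {{ interface }} substituted), not from the actual test file.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}
```

Because this is the `command` module (not `shell`), the log records `"_uses_shell": false` and the argument list is split into `["ip", "link", "del", "ethtest0"]` without shell interpretation.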
40074 1727204649.25139: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 40074 1727204649.25280: in run() - task 12b410aa-8751-9fd7-2501-0000000008da 40074 1727204649.25305: variable 'ansible_search_path' from source: unknown 40074 1727204649.25312: variable 'ansible_search_path' from source: unknown 40074 1727204649.25357: calling self._execute() 40074 1727204649.25476: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.25495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.25515: variable 'omit' from source: magic vars 40074 1727204649.25984: variable 'ansible_distribution_major_version' from source: facts 40074 1727204649.26004: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204649.26020: variable 'omit' from source: magic vars 40074 1727204649.26092: variable 'omit' from source: magic vars 40074 1727204649.26221: variable 'interface' from source: set_fact 40074 1727204649.26254: variable 'omit' from source: magic vars 40074 1727204649.26307: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204649.26362: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204649.26396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204649.26425: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204649.26447: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204649.26492: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204649.26502: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.26512: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.26650: Set connection var ansible_pipelining to False 40074 1727204649.26664: Set connection var ansible_shell_executable to /bin/sh 40074 1727204649.26678: Set connection var ansible_shell_type to sh 40074 1727204649.26686: Set connection var ansible_connection to ssh 40074 1727204649.26783: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204649.26787: Set connection var ansible_timeout to 10 40074 1727204649.26791: variable 'ansible_shell_executable' from source: unknown 40074 1727204649.26794: variable 'ansible_connection' from source: unknown 40074 1727204649.26796: variable 'ansible_module_compression' from source: unknown 40074 1727204649.26799: variable 'ansible_shell_type' from source: unknown 40074 1727204649.26801: variable 'ansible_shell_executable' from source: unknown 40074 1727204649.26803: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.26805: variable 'ansible_pipelining' from source: unknown 40074 1727204649.26808: variable 'ansible_timeout' from source: unknown 40074 1727204649.26810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.26968: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204649.26988: variable 'omit' from source: magic vars 40074 1727204649.27006: starting attempt loop 40074 1727204649.27014: running the handler 40074 1727204649.27038: _low_level_execute_command(): starting 40074 1727204649.27052: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204649.27813: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 
1727204649.27831: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204649.27846: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204649.27881: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204649.27902: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204649.27997: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204649.28021: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204649.28043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204649.28069: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204649.28145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204649.29974: stdout chunk (state=3): >>>/root <<< 40074 1727204649.30192: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204649.30196: stdout chunk (state=3): >>><<< 40074 1727204649.30199: stderr chunk (state=3): >>><<< 40074 1727204649.30225: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204649.30341: _low_level_execute_command(): starting 40074 1727204649.30345: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195 `" && echo ansible-tmp-1727204649.302334-41958-136353413520195="` echo /root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195 `" ) && sleep 0' 40074 1727204649.30966: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204649.30983: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204649.31019: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204649.31154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204649.31200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204649.31236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204649.33347: stdout chunk (state=3): >>>ansible-tmp-1727204649.302334-41958-136353413520195=/root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195 <<< 40074 1727204649.33495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204649.33578: stderr chunk (state=3): >>><<< 40074 1727204649.33588: stdout chunk (state=3): >>><<< 40074 1727204649.33795: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204649.302334-41958-136353413520195=/root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204649.33799: variable 'ansible_module_compression' from source: unknown 40074 1727204649.33801: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204649.33804: variable 'ansible_facts' from source: unknown 40074 1727204649.33861: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/AnsiballZ_command.py 40074 1727204649.34052: Sending initial data 40074 1727204649.34061: Sent initial data (155 bytes) 40074 1727204649.34692: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204649.34797: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204649.34846: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204649.34886: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204649.36621: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204649.36664: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204649.36704: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpyxi4nyiu /root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/AnsiballZ_command.py <<< 40074 1727204649.36711: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/AnsiballZ_command.py" <<< 40074 1727204649.36738: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpyxi4nyiu" to remote "/root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/AnsiballZ_command.py" <<< 40074 1727204649.37530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204649.37593: stderr chunk (state=3): >>><<< 40074 1727204649.37597: stdout chunk (state=3): >>><<< 40074 1727204649.37615: done transferring module to remote 40074 1727204649.37627: _low_level_execute_command(): starting 40074 1727204649.37633: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/ /root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/AnsiballZ_command.py && sleep 0' 40074 1727204649.38061: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204649.38096: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204649.38100: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204649.38104: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 40074 1727204649.38107: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204649.38109: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204649.38164: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204649.38168: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204649.38210: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204649.40176: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204649.40204: stderr chunk (state=3): >>><<< 40074 1727204649.40208: stdout chunk (state=3): >>><<< 40074 1727204649.40229: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204649.40233: _low_level_execute_command(): starting 40074 1727204649.40240: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/AnsiballZ_command.py && sleep 0' 40074 1727204649.40884: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204649.40923: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204649.40927: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204649.40930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204649.40999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204649.41003: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204649.41006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204649.41008: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204649.41010: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204649.41013: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204649.41066: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204649.41103: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204649.41119: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204649.41127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204649.41216: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204649.59831: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:04:09.588329", "end": "2024-09-24 15:04:09.596976", "delta": "0:00:00.008647", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}}<<< 40074 1727204649.59908: stdout chunk (state=3): >>> <<< 40074 1727204649.62037: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204649.62102: stderr chunk (state=3): >>><<< 40074 1727204649.62105: stdout chunk (state=3): >>><<< 40074 1727204649.62133: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "ethtest0"], "start": "2024-09-24 15:04:09.588329", "end": "2024-09-24 15:04:09.596976", "delta": "0:00:00.008647", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del ethtest0", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
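The module's JSON result above (`"rc": 0`, `"changed": true`, `"cmd": ["ip", "link", "del", "ethtest0"]`) is what a playbook would capture with `register`. A hypothetical follow-up illustrating how such a result is typically consumed; the keys are taken from the result in the log, while the task structure is an assumption, not part of the test suite:

```yaml
# Hypothetical usage sketch: register the command result and assert on
# the return keys (rc, changed) seen in the JSON payload logged above.
- name: Remove test interface if necessary
  command: ip link del {{ interface }}
  register: del_result

- name: Verify the deletion command succeeded
  assert:
    that:
      - del_result.rc == 0
      - del_result.changed
```

The remainder of the log entry is the standard module round trip: the wrapped `AnsiballZ_command.py` payload was executed remotely with `/usr/bin/python3.12`, and the temporary directory is removed in the `rm -f -r ... && sleep 0` step that follows.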
40074 1727204649.62168: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204649.62177: _low_level_execute_command(): starting 40074 1727204649.62183: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204649.302334-41958-136353413520195/ > /dev/null 2>&1 && sleep 0' 40074 1727204649.62650: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204649.62654: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204649.62693: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204649.62703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204649.62706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204649.62756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204649.62759: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204649.62809: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204649.64807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204649.64859: stderr chunk (state=3): >>><<< 40074 1727204649.64862: stdout chunk (state=3): >>><<< 40074 1727204649.64878: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from 
master 0 40074 1727204649.64887: handler run complete 40074 1727204649.64911: Evaluated conditional (False): False 40074 1727204649.64924: attempt loop complete, returning result 40074 1727204649.64927: _execute() done 40074 1727204649.64930: dumping result to json 40074 1727204649.64937: done dumping result, returning 40074 1727204649.64948: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [12b410aa-8751-9fd7-2501-0000000008da] 40074 1727204649.64952: sending task result for task 12b410aa-8751-9fd7-2501-0000000008da 40074 1727204649.65059: done sending task result for task 12b410aa-8751-9fd7-2501-0000000008da 40074 1727204649.65062: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "ethtest0" ], "delta": "0:00:00.008647", "end": "2024-09-24 15:04:09.596976", "rc": 0, "start": "2024-09-24 15:04:09.588329" } 40074 1727204649.65143: no more pending results, returning what we have 40074 1727204649.65147: results queue empty 40074 1727204649.65150: checking for any_errors_fatal 40074 1727204649.65151: done checking for any_errors_fatal 40074 1727204649.65152: checking for max_fail_percentage 40074 1727204649.65154: done checking for max_fail_percentage 40074 1727204649.65155: checking to see if all hosts have failed and the running result is not ok 40074 1727204649.65157: done checking to see if all hosts have failed 40074 1727204649.65158: getting the remaining hosts for this loop 40074 1727204649.65159: done getting the remaining hosts for this loop 40074 1727204649.65163: getting the next task for host managed-node2 40074 1727204649.65179: done getting next task for host managed-node2 40074 1727204649.65183: ^ task is: TASK: Assert interface0 is absent 40074 1727204649.65187: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204649.65194: getting variables 40074 1727204649.65196: in VariableManager get_vars() 40074 1727204649.65241: Calling all_inventory to load vars for managed-node2 40074 1727204649.65245: Calling groups_inventory to load vars for managed-node2 40074 1727204649.65247: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204649.65259: Calling all_plugins_play to load vars for managed-node2 40074 1727204649.65262: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204649.65265: Calling groups_plugins_play to load vars for managed-node2 40074 1727204649.66634: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204649.68237: done with get_vars() 40074 1727204649.68260: done getting variables TASK [Assert interface0 is absent] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:160 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.439) 0:00:43.445 ***** 40074 1727204649.68343: entering _queue_task() for managed-node2/include_tasks 40074 1727204649.68593: worker is 1 (out of 1 available) 40074 1727204649.68607: exiting _queue_task() for managed-node2/include_tasks 40074 1727204649.68624: done queuing things up, now waiting for results queue to drain 40074 1727204649.68626: waiting for pending results... 
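
The "Remove test interface if necessary" task whose result appears above can be read back into playbook form. Judging from the command (`ip link del ethtest0`) and the `Evaluated conditional (False): False` line, a hypothetical sketch of the task (the actual source in `tests_route_device.yml` is not shown in this log and may differ) would be roughly:

```yaml
# Hypothetical reconstruction from the log output above; the real task
# in tests_route_device.yml may differ in details such as error handling.
- name: Remove test interface if necessary
  command: ip link del ethtest0
  changed_when: false   # matches "Evaluated conditional (False): False" and
                        # explains why the final result reports changed: false
                        # even though the module itself returned changed: true
  failed_when: false    # assumption: cleanup usually tolerates a missing link
```

Note how the raw module JSON earlier in the log shows `"changed": true`, while the final `ok:` result shows `"changed": false`: a `changed_when` override like the one sketched above is the usual cause of that discrepancy.
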
40074 1727204649.68821: running TaskExecutor() for managed-node2/TASK: Assert interface0 is absent 40074 1727204649.68908: in run() - task 12b410aa-8751-9fd7-2501-0000000000b9 40074 1727204649.68923: variable 'ansible_search_path' from source: unknown 40074 1727204649.68956: calling self._execute() 40074 1727204649.69038: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.69046: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.69056: variable 'omit' from source: magic vars 40074 1727204649.69384: variable 'ansible_distribution_major_version' from source: facts 40074 1727204649.69397: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204649.69408: _execute() done 40074 1727204649.69413: dumping result to json 40074 1727204649.69416: done dumping result, returning 40074 1727204649.69422: done running TaskExecutor() for managed-node2/TASK: Assert interface0 is absent [12b410aa-8751-9fd7-2501-0000000000b9] 40074 1727204649.69426: sending task result for task 12b410aa-8751-9fd7-2501-0000000000b9 40074 1727204649.69523: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000b9 40074 1727204649.69526: WORKER PROCESS EXITING 40074 1727204649.69558: no more pending results, returning what we have 40074 1727204649.69563: in VariableManager get_vars() 40074 1727204649.69613: Calling all_inventory to load vars for managed-node2 40074 1727204649.69619: Calling groups_inventory to load vars for managed-node2 40074 1727204649.69622: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204649.69634: Calling all_plugins_play to load vars for managed-node2 40074 1727204649.69639: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204649.69643: Calling groups_plugins_play to load vars for managed-node2 40074 1727204649.70972: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to 
reserved name 40074 1727204649.72575: done with get_vars() 40074 1727204649.72598: variable 'ansible_search_path' from source: unknown 40074 1727204649.72611: we have included files to process 40074 1727204649.72612: generating all_blocks data 40074 1727204649.72613: done generating all_blocks data 40074 1727204649.72620: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 40074 1727204649.72621: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 40074 1727204649.72623: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 40074 1727204649.72712: in VariableManager get_vars() 40074 1727204649.72735: done with get_vars() 40074 1727204649.72835: done processing included file 40074 1727204649.72838: iterating over new_blocks loaded from include file 40074 1727204649.72839: in VariableManager get_vars() 40074 1727204649.72855: done with get_vars() 40074 1727204649.72856: filtering new block on tags 40074 1727204649.72883: done filtering new block on tags 40074 1727204649.72884: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 40074 1727204649.72890: extending task lists for all hosts with included blocks 40074 1727204649.74750: done extending task lists 40074 1727204649.74752: done processing included files 40074 1727204649.74753: results queue empty 40074 1727204649.74754: checking for any_errors_fatal 40074 1727204649.74760: done checking for any_errors_fatal 40074 1727204649.74762: checking for max_fail_percentage 40074 1727204649.74763: done checking for max_fail_percentage 40074 1727204649.74764: checking to see if all hosts have failed 
and the running result is not ok 40074 1727204649.74766: done checking to see if all hosts have failed 40074 1727204649.74767: getting the remaining hosts for this loop 40074 1727204649.74768: done getting the remaining hosts for this loop 40074 1727204649.74772: getting the next task for host managed-node2 40074 1727204649.74777: done getting next task for host managed-node2 40074 1727204649.74780: ^ task is: TASK: Include the task 'get_interface_stat.yml' 40074 1727204649.74784: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204649.74788: getting variables 40074 1727204649.74791: in VariableManager get_vars() 40074 1727204649.74813: Calling all_inventory to load vars for managed-node2 40074 1727204649.74819: Calling groups_inventory to load vars for managed-node2 40074 1727204649.74823: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204649.74831: Calling all_plugins_play to load vars for managed-node2 40074 1727204649.74835: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204649.74839: Calling groups_plugins_play to load vars for managed-node2 40074 1727204649.77009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204649.79292: done with get_vars() 40074 1727204649.79336: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.110) 0:00:43.556 ***** 40074 1727204649.79434: entering _queue_task() for managed-node2/include_tasks 40074 1727204649.79859: worker is 1 (out of 1 available) 40074 1727204649.79874: exiting _queue_task() for managed-node2/include_tasks 40074 1727204649.79892: done queuing things up, now waiting for results queue to drain 40074 1727204649.79894: waiting for pending results... 
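
At this point the log has loaded `assert_device_absent.yml` and is about to run its first task, the include of `get_interface_stat.yml`. From the task names and include path recorded in the log, the included file plausibly has a shape like the following hypothetical sketch (the register name `interface_stat` is an assumption; the real file is not reproduced in this log):

```yaml
# Hypothetical sketch of assert_device_absent.yml, inferred from the
# include path and task names in the log; the actual file may differ.
- name: Include the task 'get_interface_stat.yml'
  include_tasks: tasks/get_interface_stat.yml

- name: Assert that the interface is absent
  assert:
    that:
      - not interface_stat.stat.exists   # "interface_stat" is an assumed name
```
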
40074 1727204649.80232: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 40074 1727204649.80383: in run() - task 12b410aa-8751-9fd7-2501-000000000990 40074 1727204649.80401: variable 'ansible_search_path' from source: unknown 40074 1727204649.80404: variable 'ansible_search_path' from source: unknown 40074 1727204649.80466: calling self._execute() 40074 1727204649.80603: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.80607: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.80610: variable 'omit' from source: magic vars 40074 1727204649.80953: variable 'ansible_distribution_major_version' from source: facts 40074 1727204649.80963: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204649.80970: _execute() done 40074 1727204649.80976: dumping result to json 40074 1727204649.80981: done dumping result, returning 40074 1727204649.80988: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-9fd7-2501-000000000990] 40074 1727204649.80995: sending task result for task 12b410aa-8751-9fd7-2501-000000000990 40074 1727204649.81088: done sending task result for task 12b410aa-8751-9fd7-2501-000000000990 40074 1727204649.81094: WORKER PROCESS EXITING 40074 1727204649.81127: no more pending results, returning what we have 40074 1727204649.81133: in VariableManager get_vars() 40074 1727204649.81185: Calling all_inventory to load vars for managed-node2 40074 1727204649.81190: Calling groups_inventory to load vars for managed-node2 40074 1727204649.81193: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204649.81210: Calling all_plugins_play to load vars for managed-node2 40074 1727204649.81213: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204649.81217: Calling groups_plugins_play to load vars for managed-node2 40074 
1727204649.82656: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204649.85793: done with get_vars() 40074 1727204649.85835: variable 'ansible_search_path' from source: unknown 40074 1727204649.85836: variable 'ansible_search_path' from source: unknown 40074 1727204649.85887: we have included files to process 40074 1727204649.85888: generating all_blocks data 40074 1727204649.85892: done generating all_blocks data 40074 1727204649.85893: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204649.85895: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204649.85897: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 40074 1727204649.86136: done processing included file 40074 1727204649.86139: iterating over new_blocks loaded from include file 40074 1727204649.86141: in VariableManager get_vars() 40074 1727204649.86166: done with get_vars() 40074 1727204649.86168: filtering new block on tags 40074 1727204649.86203: done filtering new block on tags 40074 1727204649.86206: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 40074 1727204649.86212: extending task lists for all hosts with included blocks 40074 1727204649.86372: done extending task lists 40074 1727204649.86374: done processing included files 40074 1727204649.86375: results queue empty 40074 1727204649.86376: checking for any_errors_fatal 40074 1727204649.86379: done checking for any_errors_fatal 40074 1727204649.86380: checking for max_fail_percentage 40074 1727204649.86381: done checking for 
max_fail_percentage 40074 1727204649.86382: checking to see if all hosts have failed and the running result is not ok 40074 1727204649.86383: done checking to see if all hosts have failed 40074 1727204649.86384: getting the remaining hosts for this loop 40074 1727204649.86386: done getting the remaining hosts for this loop 40074 1727204649.86390: getting the next task for host managed-node2 40074 1727204649.86396: done getting next task for host managed-node2 40074 1727204649.86398: ^ task is: TASK: Get stat for interface {{ interface }} 40074 1727204649.86402: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204649.86405: getting variables 40074 1727204649.86406: in VariableManager get_vars() 40074 1727204649.86423: Calling all_inventory to load vars for managed-node2 40074 1727204649.86426: Calling groups_inventory to load vars for managed-node2 40074 1727204649.86429: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204649.86436: Calling all_plugins_play to load vars for managed-node2 40074 1727204649.86439: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204649.86443: Calling groups_plugins_play to load vars for managed-node2 40074 1727204649.88557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204649.91464: done with get_vars() 40074 1727204649.91509: done getting variables 40074 1727204649.91707: variable 'interface' from source: set_fact TASK [Get stat for interface ethtest0] ***************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 15:04:09 -0400 (0:00:00.123) 0:00:43.679 ***** 40074 1727204649.91745: entering _queue_task() for managed-node2/stat 40074 1727204649.92141: worker is 1 (out of 1 available) 40074 1727204649.92156: exiting _queue_task() for managed-node2/stat 40074 1727204649.92169: done queuing things up, now waiting for results queue to drain 40074 1727204649.92172: waiting for pending results... 
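
The task banner above shows the templated name `Get stat for interface {{ interface }}` resolving to `Get stat for interface ethtest0`, and the worker entering `_queue_task()` for the `stat` module. A hypothetical sketch of `get_interface_stat.yml:3` consistent with that (the sysfs path and register name are assumptions, since the file itself is not shown in the log):

```yaml
# Hypothetical sketch inferred from the templated task name and the stat
# module invocation in the log; path and register name are assumptions.
- name: Get stat for interface {{ interface }}
  stat:
    path: /sys/class/net/{{ interface }}   # assumed: sysfs entry exists only
                                           # while the network device exists
  register: interface_stat
```

Checking `/sys/class/net/<name>` is a common way to test device presence, which would let the follow-up assertion verify the earlier `ip link del ethtest0` actually removed the interface.
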
40074 1727204649.92616: running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest0 40074 1727204649.92671: in run() - task 12b410aa-8751-9fd7-2501-000000000a4d 40074 1727204649.92700: variable 'ansible_search_path' from source: unknown 40074 1727204649.92796: variable 'ansible_search_path' from source: unknown 40074 1727204649.92800: calling self._execute() 40074 1727204649.92868: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.92881: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.92903: variable 'omit' from source: magic vars 40074 1727204649.93353: variable 'ansible_distribution_major_version' from source: facts 40074 1727204649.93371: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204649.93381: variable 'omit' from source: magic vars 40074 1727204649.93465: variable 'omit' from source: magic vars 40074 1727204649.93584: variable 'interface' from source: set_fact 40074 1727204649.93677: variable 'omit' from source: magic vars 40074 1727204649.93681: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204649.93711: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204649.93739: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204649.93765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204649.93787: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204649.93825: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204649.93834: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.93843: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.93979: Set connection var ansible_pipelining to False 40074 1727204649.93993: Set connection var ansible_shell_executable to /bin/sh 40074 1727204649.94002: Set connection var ansible_shell_type to sh 40074 1727204649.94012: Set connection var ansible_connection to ssh 40074 1727204649.94022: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204649.94118: Set connection var ansible_timeout to 10 40074 1727204649.94121: variable 'ansible_shell_executable' from source: unknown 40074 1727204649.94125: variable 'ansible_connection' from source: unknown 40074 1727204649.94127: variable 'ansible_module_compression' from source: unknown 40074 1727204649.94129: variable 'ansible_shell_type' from source: unknown 40074 1727204649.94132: variable 'ansible_shell_executable' from source: unknown 40074 1727204649.94134: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204649.94136: variable 'ansible_pipelining' from source: unknown 40074 1727204649.94138: variable 'ansible_timeout' from source: unknown 40074 1727204649.94140: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204649.94364: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204649.94383: variable 'omit' from source: magic vars 40074 1727204649.94399: starting attempt loop 40074 1727204649.94407: running the handler 40074 1727204649.94429: _low_level_execute_command(): starting 40074 1727204649.94448: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204649.95319: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204649.95324: stderr chunk (state=3): >>>debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204649.95347: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204649.95364: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204649.95392: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204649.95469: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204649.97270: stdout chunk (state=3): >>>/root <<< 40074 1727204649.97473: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204649.97477: stdout chunk (state=3): >>><<< 40074 1727204649.97480: stderr chunk (state=3): >>><<< 40074 1727204649.97502: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204649.97524: _low_level_execute_command(): starting 40074 1727204649.97536: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933 `" && echo ansible-tmp-1727204649.9751012-41981-266711644505933="` echo /root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933 `" ) && sleep 0' 40074 1727204649.98192: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204649.98195: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204649.98198: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204649.98207: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204649.98210: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204649.98307: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204649.98311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204649.98356: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204650.00468: stdout chunk (state=3): >>>ansible-tmp-1727204649.9751012-41981-266711644505933=/root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933 <<< 40074 1727204650.00687: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204650.00693: stdout chunk (state=3): >>><<< 40074 1727204650.00695: stderr chunk (state=3): >>><<< 40074 1727204650.00894: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204649.9751012-41981-266711644505933=/root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204650.00899: variable 'ansible_module_compression' from source: unknown 40074 1727204650.00901: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 40074 1727204650.00903: variable 'ansible_facts' from source: unknown 40074 1727204650.00966: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/AnsiballZ_stat.py 40074 1727204650.01152: Sending initial data 40074 1727204650.01163: Sent initial data (153 bytes) 40074 1727204650.01876: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204650.01908: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204650.01924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204650.02028: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.02073: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204650.02147: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204650.03904: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204650.03972: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204650.04021: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/AnsiballZ_stat.py" <<< 40074 1727204650.04059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp17k_t00j /root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/AnsiballZ_stat.py <<< 40074 1727204650.04080: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp17k_t00j" to remote "/root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/AnsiballZ_stat.py" <<< 40074 1727204650.05142: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204650.05267: stderr chunk (state=3): >>><<< 40074 1727204650.05286: stdout chunk (state=3): >>><<< 40074 1727204650.05321: done transferring module to remote 40074 1727204650.05339: _low_level_execute_command(): starting 40074 1727204650.05353: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/ /root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/AnsiballZ_stat.py && sleep 0' 40074 1727204650.06048: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204650.06064: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204650.06107: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204650.06152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204650.06164: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.06246: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204650.06302: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204650.06349: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204650.08465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204650.08468: stdout chunk (state=3): >>><<< 40074 1727204650.08471: stderr chunk (state=3): >>><<< 40074 1727204650.08488: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204650.08585: _low_level_execute_command(): starting 40074 1727204650.08591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/AnsiballZ_stat.py && sleep 0' 40074 1727204650.09206: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.09268: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204650.09287: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204650.09312: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204650.09393: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 
1727204650.26989: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 40074 1727204650.28513: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204650.28520: stdout chunk (state=3): >>><<< 40074 1727204650.28523: stderr chunk (state=3): >>><<< 40074 1727204650.28690: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 
10.31.9.159 closed. 40074 1727204650.28695: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204650.28697: _low_level_execute_command(): starting 40074 1727204650.28700: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204649.9751012-41981-266711644505933/ > /dev/null 2>&1 && sleep 0' 40074 1727204650.29268: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.29299: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204650.29312: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204650.29339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204650.29424: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204650.31509: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204650.31554: stdout chunk (state=3): >>><<< 40074 1727204650.31557: stderr chunk (state=3): >>><<< 40074 1727204650.31575: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204650.31694: handler run complete 40074 1727204650.31698: attempt loop complete, returning 
result 40074 1727204650.31701: _execute() done 40074 1727204650.31703: dumping result to json 40074 1727204650.31705: done dumping result, returning 40074 1727204650.31708: done running TaskExecutor() for managed-node2/TASK: Get stat for interface ethtest0 [12b410aa-8751-9fd7-2501-000000000a4d] 40074 1727204650.31710: sending task result for task 12b410aa-8751-9fd7-2501-000000000a4d ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 40074 1727204650.32115: no more pending results, returning what we have 40074 1727204650.32121: results queue empty 40074 1727204650.32123: checking for any_errors_fatal 40074 1727204650.32125: done checking for any_errors_fatal 40074 1727204650.32126: checking for max_fail_percentage 40074 1727204650.32128: done checking for max_fail_percentage 40074 1727204650.32129: checking to see if all hosts have failed and the running result is not ok 40074 1727204650.32130: done checking to see if all hosts have failed 40074 1727204650.32131: getting the remaining hosts for this loop 40074 1727204650.32133: done getting the remaining hosts for this loop 40074 1727204650.32137: getting the next task for host managed-node2 40074 1727204650.32148: done getting next task for host managed-node2 40074 1727204650.32152: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 40074 1727204650.32156: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204650.32161: getting variables 40074 1727204650.32163: in VariableManager get_vars() 40074 1727204650.32224: Calling all_inventory to load vars for managed-node2 40074 1727204650.32228: Calling groups_inventory to load vars for managed-node2 40074 1727204650.32231: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204650.32240: done sending task result for task 12b410aa-8751-9fd7-2501-000000000a4d 40074 1727204650.32243: WORKER PROCESS EXITING 40074 1727204650.32255: Calling all_plugins_play to load vars for managed-node2 40074 1727204650.32259: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204650.32263: Calling groups_plugins_play to load vars for managed-node2 40074 1727204650.34881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204650.38346: done with get_vars() 40074 1727204650.38393: done getting variables 40074 1727204650.38476: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204650.38632: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'ethtest0'] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.469) 0:00:44.148 ***** 40074 1727204650.38675: entering _queue_task() for managed-node2/assert 40074 1727204650.39225: worker is 
1 (out of 1 available) 40074 1727204650.39238: exiting _queue_task() for managed-node2/assert 40074 1727204650.39250: done queuing things up, now waiting for results queue to drain 40074 1727204650.39251: waiting for pending results... 40074 1727204650.39475: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'ethtest0' 40074 1727204650.39648: in run() - task 12b410aa-8751-9fd7-2501-000000000991 40074 1727204650.39670: variable 'ansible_search_path' from source: unknown 40074 1727204650.39680: variable 'ansible_search_path' from source: unknown 40074 1727204650.39742: calling self._execute() 40074 1727204650.39863: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.39877: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.39899: variable 'omit' from source: magic vars 40074 1727204650.40397: variable 'ansible_distribution_major_version' from source: facts 40074 1727204650.40415: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204650.40475: variable 'omit' from source: magic vars 40074 1727204650.40506: variable 'omit' from source: magic vars 40074 1727204650.40644: variable 'interface' from source: set_fact 40074 1727204650.40671: variable 'omit' from source: magic vars 40074 1727204650.40800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204650.40805: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204650.40815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204650.40845: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204650.40864: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 40074 1727204650.40915: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204650.40931: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.40940: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.41130: Set connection var ansible_pipelining to False 40074 1727204650.41133: Set connection var ansible_shell_executable to /bin/sh 40074 1727204650.41138: Set connection var ansible_shell_type to sh 40074 1727204650.41144: Set connection var ansible_connection to ssh 40074 1727204650.41146: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204650.41149: Set connection var ansible_timeout to 10 40074 1727204650.41186: variable 'ansible_shell_executable' from source: unknown 40074 1727204650.41200: variable 'ansible_connection' from source: unknown 40074 1727204650.41237: variable 'ansible_module_compression' from source: unknown 40074 1727204650.41241: variable 'ansible_shell_type' from source: unknown 40074 1727204650.41243: variable 'ansible_shell_executable' from source: unknown 40074 1727204650.41246: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.41254: variable 'ansible_pipelining' from source: unknown 40074 1727204650.41258: variable 'ansible_timeout' from source: unknown 40074 1727204650.41269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.41473: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204650.41479: variable 'omit' from source: magic vars 40074 1727204650.41492: starting attempt loop 40074 1727204650.41565: running the handler 40074 1727204650.41721: variable 
'interface_stat' from source: set_fact 40074 1727204650.41739: Evaluated conditional (not interface_stat.stat.exists): True 40074 1727204650.41753: handler run complete 40074 1727204650.41785: attempt loop complete, returning result 40074 1727204650.41798: _execute() done 40074 1727204650.41808: dumping result to json 40074 1727204650.41820: done dumping result, returning 40074 1727204650.41834: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'ethtest0' [12b410aa-8751-9fd7-2501-000000000991] 40074 1727204650.41845: sending task result for task 12b410aa-8751-9fd7-2501-000000000991 ok: [managed-node2] => { "changed": false } MSG: All assertions passed 40074 1727204650.42155: no more pending results, returning what we have 40074 1727204650.42159: results queue empty 40074 1727204650.42160: checking for any_errors_fatal 40074 1727204650.42174: done checking for any_errors_fatal 40074 1727204650.42175: checking for max_fail_percentage 40074 1727204650.42177: done checking for max_fail_percentage 40074 1727204650.42178: checking to see if all hosts have failed and the running result is not ok 40074 1727204650.42179: done checking to see if all hosts have failed 40074 1727204650.42180: getting the remaining hosts for this loop 40074 1727204650.42182: done getting the remaining hosts for this loop 40074 1727204650.42187: getting the next task for host managed-node2 40074 1727204650.42200: done getting next task for host managed-node2 40074 1727204650.42212: ^ task is: TASK: Assert interface0 profile and interface1 profile are absent 40074 1727204650.42220: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204650.42226: getting variables 40074 1727204650.42229: in VariableManager get_vars() 40074 1727204650.42279: Calling all_inventory to load vars for managed-node2 40074 1727204650.42283: Calling groups_inventory to load vars for managed-node2 40074 1727204650.42286: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204650.42404: done sending task result for task 12b410aa-8751-9fd7-2501-000000000991 40074 1727204650.42408: WORKER PROCESS EXITING 40074 1727204650.42421: Calling all_plugins_play to load vars for managed-node2 40074 1727204650.42426: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204650.42430: Calling groups_plugins_play to load vars for managed-node2 40074 1727204650.45180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204650.48551: done with get_vars() 40074 1727204650.48597: done getting variables TASK [Assert interface0 profile and interface1 profile are absent] ************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:162 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.100) 0:00:44.248 ***** 40074 1727204650.48723: entering _queue_task() for managed-node2/include_tasks 40074 1727204650.49122: worker is 1 (out of 1 available) 40074 1727204650.49138: exiting _queue_task() for managed-node2/include_tasks 40074 1727204650.49153: done queuing things up, now waiting for results queue to drain 40074 1727204650.49155: waiting for pending results... 
40074 1727204650.49383: running TaskExecutor() for managed-node2/TASK: Assert interface0 profile and interface1 profile are absent 40074 1727204650.49498: in run() - task 12b410aa-8751-9fd7-2501-0000000000ba 40074 1727204650.49598: variable 'ansible_search_path' from source: unknown 40074 1727204650.49604: variable 'interface0' from source: play vars 40074 1727204650.49784: variable 'interface0' from source: play vars 40074 1727204650.49835: variable 'interface1' from source: play vars 40074 1727204650.49876: variable 'interface1' from source: play vars 40074 1727204650.49893: variable 'omit' from source: magic vars 40074 1727204650.50155: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.50161: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.50164: variable 'omit' from source: magic vars 40074 1727204650.50391: variable 'ansible_distribution_major_version' from source: facts 40074 1727204650.50498: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204650.50504: variable 'item' from source: unknown 40074 1727204650.50507: variable 'item' from source: unknown 40074 1727204650.50839: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.50843: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.50846: variable 'omit' from source: magic vars 40074 1727204650.50848: variable 'ansible_distribution_major_version' from source: facts 40074 1727204650.50851: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204650.50919: variable 'item' from source: unknown 40074 1727204650.50934: variable 'item' from source: unknown 40074 1727204650.51010: dumping result to json 40074 1727204650.51013: done dumping result, returning 40074 1727204650.51048: done running TaskExecutor() for managed-node2/TASK: Assert interface0 profile and interface1 profile are absent 
[12b410aa-8751-9fd7-2501-0000000000ba] 40074 1727204650.51056: sending task result for task 12b410aa-8751-9fd7-2501-0000000000ba 40074 1727204650.51144: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000ba 40074 1727204650.51148: WORKER PROCESS EXITING 40074 1727204650.51191: no more pending results, returning what we have 40074 1727204650.51196: in VariableManager get_vars() 40074 1727204650.51320: Calling all_inventory to load vars for managed-node2 40074 1727204650.51324: Calling groups_inventory to load vars for managed-node2 40074 1727204650.51328: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204650.51339: Calling all_plugins_play to load vars for managed-node2 40074 1727204650.51342: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204650.51346: Calling groups_plugins_play to load vars for managed-node2 40074 1727204650.53580: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204650.56383: done with get_vars() 40074 1727204650.56418: variable 'ansible_search_path' from source: unknown 40074 1727204650.56437: variable 'ansible_search_path' from source: unknown 40074 1727204650.56444: we have included files to process 40074 1727204650.56445: generating all_blocks data 40074 1727204650.56446: done generating all_blocks data 40074 1727204650.56449: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 40074 1727204650.56450: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 40074 1727204650.56452: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 40074 1727204650.56585: in VariableManager get_vars() 40074 1727204650.56607: done with get_vars() 40074 
1727204650.56703: done processing included file 40074 1727204650.56705: iterating over new_blocks loaded from include file 40074 1727204650.56706: in VariableManager get_vars() 40074 1727204650.56721: done with get_vars() 40074 1727204650.56723: filtering new block on tags 40074 1727204650.56751: done filtering new block on tags 40074 1727204650.56753: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 => (item=ethtest0) 40074 1727204650.56758: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 40074 1727204650.56759: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 40074 1727204650.56762: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 40074 1727204650.56826: in VariableManager get_vars() 40074 1727204650.56844: done with get_vars() 40074 1727204650.56919: done processing included file 40074 1727204650.56921: iterating over new_blocks loaded from include file 40074 1727204650.56922: in VariableManager get_vars() 40074 1727204650.56936: done with get_vars() 40074 1727204650.56937: filtering new block on tags 40074 1727204650.56960: done filtering new block on tags 40074 1727204650.56961: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 => (item=ethtest1) 40074 1727204650.56964: extending task lists for all hosts with included blocks 40074 1727204650.58521: done extending task lists 40074 1727204650.58522: done processing included files 40074 1727204650.58523: results queue 
empty 40074 1727204650.58524: checking for any_errors_fatal 40074 1727204650.58529: done checking for any_errors_fatal 40074 1727204650.58530: checking for max_fail_percentage 40074 1727204650.58531: done checking for max_fail_percentage 40074 1727204650.58532: checking to see if all hosts have failed and the running result is not ok 40074 1727204650.58533: done checking to see if all hosts have failed 40074 1727204650.58534: getting the remaining hosts for this loop 40074 1727204650.58536: done getting the remaining hosts for this loop 40074 1727204650.58538: getting the next task for host managed-node2 40074 1727204650.58543: done getting next task for host managed-node2 40074 1727204650.58546: ^ task is: TASK: Include the task 'get_profile_stat.yml' 40074 1727204650.58549: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204650.58552: getting variables 40074 1727204650.58553: in VariableManager get_vars() 40074 1727204650.58569: Calling all_inventory to load vars for managed-node2 40074 1727204650.58572: Calling groups_inventory to load vars for managed-node2 40074 1727204650.58580: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204650.58587: Calling all_plugins_play to load vars for managed-node2 40074 1727204650.58592: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204650.58596: Calling groups_plugins_play to load vars for managed-node2 40074 1727204650.60463: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204650.66110: done with get_vars() 40074 1727204650.66135: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.174) 0:00:44.423 ***** 40074 1727204650.66202: entering _queue_task() for managed-node2/include_tasks 40074 1727204650.66485: worker is 1 (out of 1 available) 40074 1727204650.66501: exiting _queue_task() for managed-node2/include_tasks 40074 1727204650.66519: done queuing things up, now waiting for results queue to drain 40074 1727204650.66521: waiting for pending results... 
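The banner above marks an include at assert_profile_absent.yml:3. A minimal sketch of what that task plausibly looks like — only the task name and the included filename are taken from the log; the body itself is a hypothetical reconstruction, since the source file is not reproduced in this transcript:

```yaml
# Hypothetical sketch of assert_profile_absent.yml:3.
# Name and target file come from the log; everything else is assumed.
- name: Include the task 'get_profile_stat.yml'
  include_tasks: get_profile_stat.yml
```

Note how the log reflects this shape: the include is evaluated per host, the loaded blocks are filtered on tags, and the resulting tasks are spliced into the host's task list ("extending task lists for all hosts with included blocks").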
40074 1727204650.66721: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 40074 1727204650.66825: in run() - task 12b410aa-8751-9fd7-2501-000000000a6c 40074 1727204650.66842: variable 'ansible_search_path' from source: unknown 40074 1727204650.66846: variable 'ansible_search_path' from source: unknown 40074 1727204650.66880: calling self._execute() 40074 1727204650.66972: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.66980: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.66993: variable 'omit' from source: magic vars 40074 1727204650.67329: variable 'ansible_distribution_major_version' from source: facts 40074 1727204650.67340: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204650.67347: _execute() done 40074 1727204650.67352: dumping result to json 40074 1727204650.67356: done dumping result, returning 40074 1727204650.67363: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-9fd7-2501-000000000a6c] 40074 1727204650.67369: sending task result for task 12b410aa-8751-9fd7-2501-000000000a6c 40074 1727204650.67464: done sending task result for task 12b410aa-8751-9fd7-2501-000000000a6c 40074 1727204650.67469: WORKER PROCESS EXITING 40074 1727204650.67533: no more pending results, returning what we have 40074 1727204650.67539: in VariableManager get_vars() 40074 1727204650.67590: Calling all_inventory to load vars for managed-node2 40074 1727204650.67593: Calling groups_inventory to load vars for managed-node2 40074 1727204650.67596: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204650.67608: Calling all_plugins_play to load vars for managed-node2 40074 1727204650.67612: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204650.67615: Calling groups_plugins_play to load vars for managed-node2 40074 
1727204650.68856: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204650.70464: done with get_vars() 40074 1727204650.70483: variable 'ansible_search_path' from source: unknown 40074 1727204650.70484: variable 'ansible_search_path' from source: unknown 40074 1727204650.70521: we have included files to process 40074 1727204650.70522: generating all_blocks data 40074 1727204650.70524: done generating all_blocks data 40074 1727204650.70525: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 40074 1727204650.70526: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 40074 1727204650.70527: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 40074 1727204650.71420: done processing included file 40074 1727204650.71422: iterating over new_blocks loaded from include file 40074 1727204650.71423: in VariableManager get_vars() 40074 1727204650.71441: done with get_vars() 40074 1727204650.71443: filtering new block on tags 40074 1727204650.71553: done filtering new block on tags 40074 1727204650.71556: in VariableManager get_vars() 40074 1727204650.71571: done with get_vars() 40074 1727204650.71573: filtering new block on tags 40074 1727204650.71624: done filtering new block on tags 40074 1727204650.71626: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 40074 1727204650.71631: extending task lists for all hosts with included blocks 40074 1727204650.71734: done extending task lists 40074 1727204650.71736: done processing included files 40074 1727204650.71736: results queue empty 40074 
1727204650.71737: checking for any_errors_fatal 40074 1727204650.71740: done checking for any_errors_fatal 40074 1727204650.71741: checking for max_fail_percentage 40074 1727204650.71742: done checking for max_fail_percentage 40074 1727204650.71742: checking to see if all hosts have failed and the running result is not ok 40074 1727204650.71743: done checking to see if all hosts have failed 40074 1727204650.71743: getting the remaining hosts for this loop 40074 1727204650.71744: done getting the remaining hosts for this loop 40074 1727204650.71746: getting the next task for host managed-node2 40074 1727204650.71749: done getting next task for host managed-node2 40074 1727204650.71751: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 40074 1727204650.71753: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204650.71755: getting variables 40074 1727204650.71756: in VariableManager get_vars() 40074 1727204650.71766: Calling all_inventory to load vars for managed-node2 40074 1727204650.71768: Calling groups_inventory to load vars for managed-node2 40074 1727204650.71770: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204650.71774: Calling all_plugins_play to load vars for managed-node2 40074 1727204650.71776: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204650.71778: Calling groups_plugins_play to load vars for managed-node2 40074 1727204650.72877: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204650.74468: done with get_vars() 40074 1727204650.74488: done getting variables 40074 1727204650.74527: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.083) 0:00:44.507 ***** 40074 1727204650.74556: entering _queue_task() for managed-node2/set_fact 40074 1727204650.74830: worker is 1 (out of 1 available) 40074 1727204650.74845: exiting _queue_task() for managed-node2/set_fact 40074 1727204650.74860: done queuing things up, now waiting for results queue to drain 40074 1727204650.74862: waiting for pending results... 
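The set_fact task queued above produces the `ok: [managed-node2]` result recorded a few records later, which shows exactly three facts initialized to `false`. A sketch of the task at get_profile_stat.yml:3 consistent with that result — the fact names and values are grounded in the log's own output, but the file body is an assumption:

```yaml
# Hypothetical sketch of get_profile_stat.yml:3, inferred from the
# ansible_facts printed in this log's task result.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```

Because set_fact runs entirely on the controller, no `_low_level_execute_command()` records appear for it; the next task ("Stat profile file") is the first to open the SSH connection.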
40074 1727204650.75052: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 40074 1727204650.75151: in run() - task 12b410aa-8751-9fd7-2501-000000000b3c 40074 1727204650.75164: variable 'ansible_search_path' from source: unknown 40074 1727204650.75168: variable 'ansible_search_path' from source: unknown 40074 1727204650.75205: calling self._execute() 40074 1727204650.75285: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.75293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.75308: variable 'omit' from source: magic vars 40074 1727204650.75632: variable 'ansible_distribution_major_version' from source: facts 40074 1727204650.75642: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204650.75649: variable 'omit' from source: magic vars 40074 1727204650.75696: variable 'omit' from source: magic vars 40074 1727204650.75724: variable 'omit' from source: magic vars 40074 1727204650.75762: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204650.75796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204650.75814: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204650.75832: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204650.75843: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204650.75875: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204650.75880: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.75884: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 40074 1727204650.75976: Set connection var ansible_pipelining to False 40074 1727204650.75982: Set connection var ansible_shell_executable to /bin/sh 40074 1727204650.75986: Set connection var ansible_shell_type to sh 40074 1727204650.75990: Set connection var ansible_connection to ssh 40074 1727204650.75997: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204650.76004: Set connection var ansible_timeout to 10 40074 1727204650.76027: variable 'ansible_shell_executable' from source: unknown 40074 1727204650.76031: variable 'ansible_connection' from source: unknown 40074 1727204650.76034: variable 'ansible_module_compression' from source: unknown 40074 1727204650.76036: variable 'ansible_shell_type' from source: unknown 40074 1727204650.76041: variable 'ansible_shell_executable' from source: unknown 40074 1727204650.76045: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.76050: variable 'ansible_pipelining' from source: unknown 40074 1727204650.76053: variable 'ansible_timeout' from source: unknown 40074 1727204650.76059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.76174: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204650.76186: variable 'omit' from source: magic vars 40074 1727204650.76197: starting attempt loop 40074 1727204650.76200: running the handler 40074 1727204650.76213: handler run complete 40074 1727204650.76223: attempt loop complete, returning result 40074 1727204650.76226: _execute() done 40074 1727204650.76229: dumping result to json 40074 1727204650.76234: done dumping result, returning 40074 1727204650.76242: done running TaskExecutor() for 
managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-9fd7-2501-000000000b3c] 40074 1727204650.76246: sending task result for task 12b410aa-8751-9fd7-2501-000000000b3c 40074 1727204650.76343: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b3c 40074 1727204650.76346: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 40074 1727204650.76414: no more pending results, returning what we have 40074 1727204650.76419: results queue empty 40074 1727204650.76421: checking for any_errors_fatal 40074 1727204650.76422: done checking for any_errors_fatal 40074 1727204650.76423: checking for max_fail_percentage 40074 1727204650.76425: done checking for max_fail_percentage 40074 1727204650.76426: checking to see if all hosts have failed and the running result is not ok 40074 1727204650.76427: done checking to see if all hosts have failed 40074 1727204650.76428: getting the remaining hosts for this loop 40074 1727204650.76429: done getting the remaining hosts for this loop 40074 1727204650.76434: getting the next task for host managed-node2 40074 1727204650.76442: done getting next task for host managed-node2 40074 1727204650.76445: ^ task is: TASK: Stat profile file 40074 1727204650.76450: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204650.76454: getting variables 40074 1727204650.76455: in VariableManager get_vars() 40074 1727204650.76495: Calling all_inventory to load vars for managed-node2 40074 1727204650.76499: Calling groups_inventory to load vars for managed-node2 40074 1727204650.76501: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204650.76511: Calling all_plugins_play to load vars for managed-node2 40074 1727204650.76514: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204650.76520: Calling groups_plugins_play to load vars for managed-node2 40074 1727204650.77738: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204650.80648: done with get_vars() 40074 1727204650.80686: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:04:10 -0400 (0:00:00.062) 0:00:44.569 ***** 40074 1727204650.80820: entering _queue_task() for managed-node2/stat 40074 1727204650.81140: worker is 1 (out of 1 available) 40074 1727204650.81156: exiting _queue_task() for 
managed-node2/stat 40074 1727204650.81172: done queuing things up, now waiting for results queue to drain 40074 1727204650.81173: waiting for pending results... 40074 1727204650.81524: running TaskExecutor() for managed-node2/TASK: Stat profile file 40074 1727204650.81727: in run() - task 12b410aa-8751-9fd7-2501-000000000b3d 40074 1727204650.81732: variable 'ansible_search_path' from source: unknown 40074 1727204650.81735: variable 'ansible_search_path' from source: unknown 40074 1727204650.81738: calling self._execute() 40074 1727204650.81837: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.81854: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.81874: variable 'omit' from source: magic vars 40074 1727204650.82373: variable 'ansible_distribution_major_version' from source: facts 40074 1727204650.82403: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204650.82421: variable 'omit' from source: magic vars 40074 1727204650.82504: variable 'omit' from source: magic vars 40074 1727204650.82638: variable 'profile' from source: include params 40074 1727204650.82651: variable 'item' from source: include params 40074 1727204650.82760: variable 'item' from source: include params 40074 1727204650.82819: variable 'omit' from source: magic vars 40074 1727204650.82853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204650.82905: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204650.82995: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204650.82999: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204650.83002: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204650.83038: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204650.83049: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.83053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.83149: Set connection var ansible_pipelining to False 40074 1727204650.83154: Set connection var ansible_shell_executable to /bin/sh 40074 1727204650.83158: Set connection var ansible_shell_type to sh 40074 1727204650.83160: Set connection var ansible_connection to ssh 40074 1727204650.83170: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204650.83176: Set connection var ansible_timeout to 10 40074 1727204650.83201: variable 'ansible_shell_executable' from source: unknown 40074 1727204650.83205: variable 'ansible_connection' from source: unknown 40074 1727204650.83207: variable 'ansible_module_compression' from source: unknown 40074 1727204650.83210: variable 'ansible_shell_type' from source: unknown 40074 1727204650.83215: variable 'ansible_shell_executable' from source: unknown 40074 1727204650.83217: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204650.83225: variable 'ansible_pipelining' from source: unknown 40074 1727204650.83228: variable 'ansible_timeout' from source: unknown 40074 1727204650.83233: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204650.83410: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204650.83423: variable 'omit' from source: magic vars 40074 1727204650.83430: starting attempt loop 40074 1727204650.83433: running 
the handler 40074 1727204650.83446: _low_level_execute_command(): starting 40074 1727204650.83454: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204650.83971: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204650.83984: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.84011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204650.84015: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.84064: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204650.84080: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204650.84127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204650.85898: stdout chunk (state=3): >>>/root <<< 40074 1727204650.86004: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204650.86060: stderr chunk (state=3): >>><<< 40074 1727204650.86063: stdout chunk (state=3): >>><<< 40074 1727204650.86092: _low_level_execute_command() done: rc=0, stdout=/root , 
stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204650.86106: _low_level_execute_command(): starting 40074 1727204650.86112: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403 `" && echo ansible-tmp-1727204650.86092-42006-150353958689403="` echo /root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403 `" ) && sleep 0' 40074 1727204650.86556: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204650.86593: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204650.86598: stderr chunk (state=3): >>>debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.86610: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204650.86612: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204650.86615: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.86662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204650.86665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204650.86711: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204650.88725: stdout chunk (state=3): >>>ansible-tmp-1727204650.86092-42006-150353958689403=/root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403 <<< 40074 1727204650.88840: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204650.88900: stderr chunk (state=3): >>><<< 40074 1727204650.88903: stdout chunk (state=3): >>><<< 40074 1727204650.88922: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204650.86092-42006-150353958689403=/root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204650.88965: variable 'ansible_module_compression' from source: unknown 40074 1727204650.89021: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 40074 1727204650.89052: variable 'ansible_facts' from source: unknown 40074 1727204650.89122: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/AnsiballZ_stat.py 40074 1727204650.89243: Sending initial data 40074 1727204650.89246: Sent initial data (151 bytes) 40074 1727204650.89693: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204650.89703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204650.89732: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 40074 1727204650.89736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204650.89739: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.89799: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204650.89802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204650.89848: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204650.91521: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 40074 1727204650.91525: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204650.91552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204650.91590: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpl9p9zhph /root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/AnsiballZ_stat.py <<< 40074 1727204650.91597: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/AnsiballZ_stat.py" <<< 40074 1727204650.91628: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpl9p9zhph" to remote "/root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/AnsiballZ_stat.py" <<< 40074 1727204650.92397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204650.92461: stderr chunk (state=3): >>><<< 40074 1727204650.92465: stdout chunk (state=3): >>><<< 40074 1727204650.92485: done transferring module to remote 40074 1727204650.92498: _low_level_execute_command(): starting 40074 1727204650.92503: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/ /root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/AnsiballZ_stat.py && sleep 0' 40074 1727204650.92972: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204650.92975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.92979: stderr chunk (state=3): >>>debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204650.92982: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.93036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204650.93042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204650.93079: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204650.94966: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204650.95022: stderr chunk (state=3): >>><<< 40074 1727204650.95026: stdout chunk (state=3): >>><<< 40074 1727204650.95042: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204650.95045: _low_level_execute_command(): starting 40074 1727204650.95051: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/AnsiballZ_stat.py && sleep 0' 40074 1727204650.95486: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204650.95523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204650.95526: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204650.95529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 40074 1727204650.95531: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204650.95533: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204650.95579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204650.95599: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 40074 1727204650.95651: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204651.13271: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 40074 1727204651.14906: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204651.14910: stdout chunk (state=3): >>><<< 40074 1727204651.14913: stderr chunk (state=3): >>><<< 40074 1727204651.14934: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204651.14979: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204651.15078: _low_level_execute_command(): starting 40074 1727204651.15081: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204650.86092-42006-150353958689403/ > /dev/null 2>&1 && sleep 0' 40074 1727204651.15629: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204651.15645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204651.15660: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204651.15680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204651.15701: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204651.15714: stderr chunk (state=3): >>>debug2: match not found <<< 40074 
1727204651.15731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.15753: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204651.15813: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.15872: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204651.15893: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204651.15915: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204651.15988: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204651.18195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204651.18199: stdout chunk (state=3): >>><<< 40074 1727204651.18202: stderr chunk (state=3): >>><<< 40074 1727204651.18204: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204651.18207: handler run complete 40074 1727204651.18209: attempt loop complete, returning result 40074 1727204651.18211: _execute() done 40074 1727204651.18213: dumping result to json 40074 1727204651.18215: done dumping result, returning 40074 1727204651.18217: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-9fd7-2501-000000000b3d] 40074 1727204651.18219: sending task result for task 12b410aa-8751-9fd7-2501-000000000b3d ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 40074 1727204651.18405: no more pending results, returning what we have 40074 1727204651.18409: results queue empty 40074 1727204651.18411: checking for any_errors_fatal 40074 1727204651.18419: done checking for any_errors_fatal 40074 1727204651.18420: checking for max_fail_percentage 40074 1727204651.18422: done checking for max_fail_percentage 40074 1727204651.18423: checking to see if all hosts have failed and the running result is not ok 40074 1727204651.18424: done checking to see if all hosts have failed 40074 1727204651.18425: getting the remaining hosts for this loop 40074 1727204651.18427: done getting the remaining hosts for this loop 40074 1727204651.18432: getting the next task for host managed-node2 40074 1727204651.18442: done getting next task for host managed-node2 40074 1727204651.18445: ^ task is: TASK: Set NM profile exist flag based on the profile 
files 40074 1727204651.18451: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204651.18455: getting variables 40074 1727204651.18457: in VariableManager get_vars() 40074 1727204651.18623: Calling all_inventory to load vars for managed-node2 40074 1727204651.18628: Calling groups_inventory to load vars for managed-node2 40074 1727204651.18631: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204651.18639: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b3d 40074 1727204651.18643: WORKER PROCESS EXITING 40074 1727204651.18656: Calling all_plugins_play to load vars for managed-node2 40074 1727204651.18660: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204651.18664: Calling groups_plugins_play to load vars for managed-node2 40074 1727204651.21223: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204651.24338: done with get_vars() 40074 1727204651.24383: done getting variables 40074 1727204651.24462: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.436) 0:00:45.006 ***** 40074 1727204651.24505: entering _queue_task() for managed-node2/set_fact 40074 1727204651.25102: worker is 1 (out of 1 available) 40074 1727204651.25115: exiting _queue_task() for managed-node2/set_fact 40074 1727204651.25128: done queuing things up, now waiting for results queue to drain 40074 1727204651.25130: waiting for pending results... 
40074 1727204651.25248: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 40074 1727204651.25422: in run() - task 12b410aa-8751-9fd7-2501-000000000b3e 40074 1727204651.25465: variable 'ansible_search_path' from source: unknown 40074 1727204651.25471: variable 'ansible_search_path' from source: unknown 40074 1727204651.25508: calling self._execute() 40074 1727204651.25684: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204651.25687: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204651.25692: variable 'omit' from source: magic vars 40074 1727204651.26128: variable 'ansible_distribution_major_version' from source: facts 40074 1727204651.26148: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204651.26316: variable 'profile_stat' from source: set_fact 40074 1727204651.26344: Evaluated conditional (profile_stat.stat.exists): False 40074 1727204651.26393: when evaluation is False, skipping this task 40074 1727204651.26397: _execute() done 40074 1727204651.26399: dumping result to json 40074 1727204651.26402: done dumping result, returning 40074 1727204651.26405: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-9fd7-2501-000000000b3e] 40074 1727204651.26408: sending task result for task 12b410aa-8751-9fd7-2501-000000000b3e skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 40074 1727204651.26601: no more pending results, returning what we have 40074 1727204651.26606: results queue empty 40074 1727204651.26607: checking for any_errors_fatal 40074 1727204651.26618: done checking for any_errors_fatal 40074 1727204651.26619: checking for max_fail_percentage 40074 1727204651.26621: done checking for max_fail_percentage 40074 1727204651.26622: checking to see if all 
hosts have failed and the running result is not ok 40074 1727204651.26624: done checking to see if all hosts have failed 40074 1727204651.26625: getting the remaining hosts for this loop 40074 1727204651.26626: done getting the remaining hosts for this loop 40074 1727204651.26631: getting the next task for host managed-node2 40074 1727204651.26641: done getting next task for host managed-node2 40074 1727204651.26644: ^ task is: TASK: Get NM profile info 40074 1727204651.26651: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204651.26775: getting variables 40074 1727204651.26778: in VariableManager get_vars() 40074 1727204651.26831: Calling all_inventory to load vars for managed-node2 40074 1727204651.26836: Calling groups_inventory to load vars for managed-node2 40074 1727204651.26839: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204651.27004: Calling all_plugins_play to load vars for managed-node2 40074 1727204651.27008: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204651.27012: Calling groups_plugins_play to load vars for managed-node2 40074 1727204651.27615: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b3e 40074 1727204651.27619: WORKER PROCESS EXITING 40074 1727204651.29505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204651.32543: done with get_vars() 40074 1727204651.32587: done getting variables 40074 1727204651.32705: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.082) 0:00:45.089 ***** 40074 1727204651.32744: entering _queue_task() for managed-node2/shell 40074 1727204651.32746: Creating lock for shell 40074 1727204651.33141: worker is 1 (out of 1 available) 40074 1727204651.33156: exiting _queue_task() for managed-node2/shell 40074 1727204651.33170: done queuing things up, now waiting for results queue to drain 40074 1727204651.33172: waiting for pending results... 
40074 1727204651.33488: running TaskExecutor() for managed-node2/TASK: Get NM profile info 40074 1727204651.33656: in run() - task 12b410aa-8751-9fd7-2501-000000000b3f 40074 1727204651.33679: variable 'ansible_search_path' from source: unknown 40074 1727204651.33687: variable 'ansible_search_path' from source: unknown 40074 1727204651.33738: calling self._execute() 40074 1727204651.33858: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204651.33871: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204651.33888: variable 'omit' from source: magic vars 40074 1727204651.34352: variable 'ansible_distribution_major_version' from source: facts 40074 1727204651.34371: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204651.34392: variable 'omit' from source: magic vars 40074 1727204651.34470: variable 'omit' from source: magic vars 40074 1727204651.34611: variable 'profile' from source: include params 40074 1727204651.34622: variable 'item' from source: include params 40074 1727204651.34795: variable 'item' from source: include params 40074 1727204651.34798: variable 'omit' from source: magic vars 40074 1727204651.34800: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204651.34850: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204651.34877: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204651.34905: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204651.34931: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204651.34969: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 
1727204651.34978: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204651.34986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204651.35125: Set connection var ansible_pipelining to False 40074 1727204651.35144: Set connection var ansible_shell_executable to /bin/sh 40074 1727204651.35154: Set connection var ansible_shell_type to sh 40074 1727204651.35161: Set connection var ansible_connection to ssh 40074 1727204651.35173: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204651.35184: Set connection var ansible_timeout to 10 40074 1727204651.35218: variable 'ansible_shell_executable' from source: unknown 40074 1727204651.35227: variable 'ansible_connection' from source: unknown 40074 1727204651.35251: variable 'ansible_module_compression' from source: unknown 40074 1727204651.35254: variable 'ansible_shell_type' from source: unknown 40074 1727204651.35256: variable 'ansible_shell_executable' from source: unknown 40074 1727204651.35259: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204651.35362: variable 'ansible_pipelining' from source: unknown 40074 1727204651.35366: variable 'ansible_timeout' from source: unknown 40074 1727204651.35368: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204651.35452: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204651.35477: variable 'omit' from source: magic vars 40074 1727204651.35488: starting attempt loop 40074 1727204651.35498: running the handler 40074 1727204651.35513: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204651.35540: _low_level_execute_command(): starting 40074 1727204651.35553: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204651.36342: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204651.36365: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204651.36468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.36512: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204651.36536: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204651.36619: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204651.38442: stdout chunk (state=3): >>>/root <<< 40074 1727204651.38563: stdout chunk (state=3): >>><<< 40074 1727204651.38572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204651.38582: 
stderr chunk (state=3): >>><<< 40074 1727204651.38617: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204651.38639: _low_level_execute_command(): starting 40074 1727204651.38650: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657 `" && echo ansible-tmp-1727204651.3862393-42022-79866941967657="` echo /root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657 `" ) && sleep 0' 40074 1727204651.39273: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204651.39292: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204651.39316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 40074 1727204651.39335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204651.39352: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204651.39364: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204651.39378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.39409: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204651.39506: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204651.39534: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204651.39599: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204651.41688: stdout chunk (state=3): >>>ansible-tmp-1727204651.3862393-42022-79866941967657=/root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657 <<< 40074 1727204651.41885: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204651.41902: stdout chunk (state=3): >>><<< 40074 1727204651.41918: stderr chunk (state=3): >>><<< 40074 1727204651.41943: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204651.3862393-42022-79866941967657=/root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657 , stderr=OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204651.41986: variable 'ansible_module_compression' from source: unknown 40074 1727204651.42062: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204651.42299: variable 'ansible_facts' from source: unknown 40074 1727204651.42302: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/AnsiballZ_command.py 40074 1727204651.42427: Sending initial data 40074 1727204651.42436: Sent initial data (155 bytes) 40074 1727204651.43049: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204651.43073: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204651.43091: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 
1727204651.43115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204651.43134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204651.43184: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.43262: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204651.43293: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204651.43317: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204651.43398: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204651.45127: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 40074 1727204651.45142: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 
debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204651.45206: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204651.45235: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpzd89utjk /root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/AnsiballZ_command.py <<< 40074 1727204651.45241: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/AnsiballZ_command.py" <<< 40074 1727204651.45273: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpzd89utjk" to remote "/root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/AnsiballZ_command.py" <<< 40074 1727204651.46047: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204651.46112: stderr chunk (state=3): >>><<< 40074 1727204651.46116: stdout chunk (state=3): >>><<< 40074 1727204651.46142: done transferring module to remote 40074 1727204651.46156: _low_level_execute_command(): starting 40074 1727204651.46163: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/ /root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/AnsiballZ_command.py && sleep 0' 40074 1727204651.46639: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204651.46642: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204651.46648: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 40074 1727204651.46651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204651.46653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.46694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204651.46697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204651.46746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204651.48627: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204651.48680: stderr chunk (state=3): >>><<< 40074 1727204651.48683: stdout chunk (state=3): >>><<< 40074 1727204651.48700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204651.48703: _low_level_execute_command(): starting 40074 1727204651.48711: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/AnsiballZ_command.py && sleep 0' 40074 1727204651.49171: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204651.49174: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.49177: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204651.49179: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.49228: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204651.49231: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204651.49288: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204651.69023: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:04:11.668880", "end": "2024-09-24 15:04:11.689171", "delta": "0:00:00.020291", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204651.70897: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204651.70902: stdout chunk (state=3): >>><<< 40074 1727204651.70906: stderr chunk (state=3): >>><<< 40074 1727204651.70909: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "start": "2024-09-24 15:04:11.668880", "end": "2024-09-24 15:04:11.689171", "delta": "0:00:00.020291", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
40074 1727204651.70914: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204651.70931: _low_level_execute_command(): starting 40074 1727204651.70937: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204651.3862393-42022-79866941967657/ > /dev/null 2>&1 && sleep 0' 40074 1727204651.71508: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204651.71514: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204651.71545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.71550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 
1727204651.71552: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204651.71616: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204651.71624: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204651.71669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204651.73670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204651.73728: stderr chunk (state=3): >>><<< 40074 1727204651.73732: stdout chunk (state=3): >>><<< 40074 1727204651.73747: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204651.73755: handler run complete 40074 1727204651.73777: Evaluated conditional (False): False 40074 1727204651.73788: attempt loop complete, returning result 40074 1727204651.73793: _execute() done 40074 1727204651.73798: dumping result to json 40074 1727204651.73806: done dumping result, returning 40074 1727204651.73814: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-9fd7-2501-000000000b3f] 40074 1727204651.73823: sending task result for task 12b410aa-8751-9fd7-2501-000000000b3f 40074 1727204651.73934: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b3f 40074 1727204651.73939: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest0 | grep /etc", "delta": "0:00:00.020291", "end": "2024-09-24 15:04:11.689171", "rc": 1, "start": "2024-09-24 15:04:11.668880" } MSG: non-zero return code ...ignoring 40074 1727204651.74035: no more pending results, returning what we have 40074 1727204651.74039: results queue empty 40074 1727204651.74040: checking for any_errors_fatal 40074 1727204651.74046: done checking for any_errors_fatal 40074 1727204651.74054: checking for max_fail_percentage 40074 1727204651.74057: done checking for max_fail_percentage 40074 1727204651.74057: checking to see if all hosts have failed and the running result is not ok 40074 1727204651.74059: done checking to see if all hosts have failed 40074 1727204651.74059: getting the remaining hosts for this loop 40074 1727204651.74061: done getting the remaining hosts for this loop 40074 1727204651.74066: getting the next task for host managed-node2 40074 1727204651.74075: done getting next task for host managed-node2 40074 1727204651.74078: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 40074 1727204651.74083: ^ state 
is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204651.74087: getting variables 40074 1727204651.74091: in VariableManager get_vars() 40074 1727204651.74136: Calling all_inventory to load vars for managed-node2 40074 1727204651.74139: Calling groups_inventory to load vars for managed-node2 40074 1727204651.74142: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204651.74153: Calling all_plugins_play to load vars for managed-node2 40074 1727204651.74163: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204651.74168: Calling groups_plugins_play to load vars for managed-node2 40074 1727204651.75459: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204651.77197: done with get_vars() 40074 1727204651.77224: done getting variables 40074 1727204651.77274: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.445) 0:00:45.534 ***** 40074 1727204651.77305: entering _queue_task() for managed-node2/set_fact 40074 1727204651.77577: worker is 1 (out of 1 available) 40074 1727204651.77592: exiting _queue_task() for managed-node2/set_fact 40074 1727204651.77607: done queuing things up, now waiting for results queue to drain 40074 1727204651.77608: waiting for pending results... 
40074 1727204651.77809: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 40074 1727204651.77904: in run() - task 12b410aa-8751-9fd7-2501-000000000b40 40074 1727204651.77920: variable 'ansible_search_path' from source: unknown 40074 1727204651.77925: variable 'ansible_search_path' from source: unknown 40074 1727204651.77958: calling self._execute() 40074 1727204651.78045: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204651.78051: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204651.78066: variable 'omit' from source: magic vars 40074 1727204651.78388: variable 'ansible_distribution_major_version' from source: facts 40074 1727204651.78404: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204651.78515: variable 'nm_profile_exists' from source: set_fact 40074 1727204651.78528: Evaluated conditional (nm_profile_exists.rc == 0): False 40074 1727204651.78531: when evaluation is False, skipping this task 40074 1727204651.78534: _execute() done 40074 1727204651.78540: dumping result to json 40074 1727204651.78543: done dumping result, returning 40074 1727204651.78551: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-9fd7-2501-000000000b40] 40074 1727204651.78556: sending task result for task 12b410aa-8751-9fd7-2501-000000000b40 40074 1727204651.78652: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b40 40074 1727204651.78655: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 40074 1727204651.78708: no more pending results, returning what we have 40074 1727204651.78712: results queue empty 40074 1727204651.78713: checking for any_errors_fatal 40074 
1727204651.78724: done checking for any_errors_fatal 40074 1727204651.78725: checking for max_fail_percentage 40074 1727204651.78727: done checking for max_fail_percentage 40074 1727204651.78728: checking to see if all hosts have failed and the running result is not ok 40074 1727204651.78729: done checking to see if all hosts have failed 40074 1727204651.78730: getting the remaining hosts for this loop 40074 1727204651.78732: done getting the remaining hosts for this loop 40074 1727204651.78736: getting the next task for host managed-node2 40074 1727204651.78746: done getting next task for host managed-node2 40074 1727204651.78750: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 40074 1727204651.78756: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204651.78760: getting variables 40074 1727204651.78762: in VariableManager get_vars() 40074 1727204651.78802: Calling all_inventory to load vars for managed-node2 40074 1727204651.78805: Calling groups_inventory to load vars for managed-node2 40074 1727204651.78807: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204651.78820: Calling all_plugins_play to load vars for managed-node2 40074 1727204651.78824: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204651.78827: Calling groups_plugins_play to load vars for managed-node2 40074 1727204651.80056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204651.81695: done with get_vars() 40074 1727204651.81727: done getting variables 40074 1727204651.81779: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204651.81890: variable 'profile' from source: include params 40074 1727204651.81894: variable 'item' from source: include params 40074 1727204651.81953: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest0] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.046) 0:00:45.581 ***** 40074 1727204651.81980: entering _queue_task() for managed-node2/command 40074 1727204651.82257: worker is 1 (out of 1 available) 40074 1727204651.82271: exiting _queue_task() for managed-node2/command 40074 1727204651.82285: done queuing things up, now waiting for results queue to drain 40074 1727204651.82287: waiting for pending results... 
40074 1727204651.82499: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 40074 1727204651.82595: in run() - task 12b410aa-8751-9fd7-2501-000000000b42 40074 1727204651.82608: variable 'ansible_search_path' from source: unknown 40074 1727204651.82612: variable 'ansible_search_path' from source: unknown 40074 1727204651.82649: calling self._execute() 40074 1727204651.82741: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204651.82745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204651.82757: variable 'omit' from source: magic vars 40074 1727204651.83150: variable 'ansible_distribution_major_version' from source: facts 40074 1727204651.83154: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204651.83436: variable 'profile_stat' from source: set_fact 40074 1727204651.83440: Evaluated conditional (profile_stat.stat.exists): False 40074 1727204651.83442: when evaluation is False, skipping this task 40074 1727204651.83444: _execute() done 40074 1727204651.83447: dumping result to json 40074 1727204651.83450: done dumping result, returning 40074 1727204651.83452: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-ethtest0 [12b410aa-8751-9fd7-2501-000000000b42] 40074 1727204651.83454: sending task result for task 12b410aa-8751-9fd7-2501-000000000b42 40074 1727204651.83537: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b42 skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 40074 1727204651.83597: no more pending results, returning what we have 40074 1727204651.83601: results queue empty 40074 1727204651.83602: checking for any_errors_fatal 40074 1727204651.83611: done checking for any_errors_fatal 40074 1727204651.83612: checking for max_fail_percentage 40074 
1727204651.83614: done checking for max_fail_percentage 40074 1727204651.83615: checking to see if all hosts have failed and the running result is not ok 40074 1727204651.83616: done checking to see if all hosts have failed 40074 1727204651.83617: getting the remaining hosts for this loop 40074 1727204651.83619: done getting the remaining hosts for this loop 40074 1727204651.83623: getting the next task for host managed-node2 40074 1727204651.83631: done getting next task for host managed-node2 40074 1727204651.83634: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 40074 1727204651.83639: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False
40074 1727204651.83643: getting variables
40074 1727204651.83644: in VariableManager get_vars()
40074 1727204651.83687: Calling all_inventory to load vars for managed-node2
40074 1727204651.83692: Calling groups_inventory to load vars for managed-node2
40074 1727204651.83695: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204651.83706: Calling all_plugins_play to load vars for managed-node2
40074 1727204651.83709: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204651.83713: Calling groups_plugins_play to load vars for managed-node2
40074 1727204651.84296: WORKER PROCESS EXITING
40074 1727204651.86317: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204651.88726: done with get_vars()
40074 1727204651.88758: done getting variables
40074 1727204651.88816: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
40074 1727204651.88914: variable 'profile' from source: include params
40074 1727204651.88918: variable 'item' from source: include params
40074 1727204651.88967: variable 'item' from source: include params

TASK [Verify the ansible_managed comment in ifcfg-ethtest0] ********************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56
Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.070) 0:00:45.651 *****
40074 1727204651.88998: entering _queue_task() for managed-node2/set_fact
40074 1727204651.89277: worker is 1 (out of 1 available)
40074 1727204651.89292: exiting _queue_task() for managed-node2/set_fact
40074 1727204651.89308: done queuing things up, now waiting for results queue to drain
40074 1727204651.89310: waiting for pending results...
40074 1727204651.89512: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0
40074 1727204651.89612: in run() - task 12b410aa-8751-9fd7-2501-000000000b43
40074 1727204651.89628: variable 'ansible_search_path' from source: unknown
40074 1727204651.89632: variable 'ansible_search_path' from source: unknown
40074 1727204651.89669: calling self._execute()
40074 1727204651.89762: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204651.89769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204651.89779: variable 'omit' from source: magic vars
40074 1727204651.90155: variable 'ansible_distribution_major_version' from source: facts
40074 1727204651.90164: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204651.90338: variable 'profile_stat' from source: set_fact
40074 1727204651.90342: Evaluated conditional (profile_stat.stat.exists): False
40074 1727204651.90345: when evaluation is False, skipping this task
40074 1727204651.90349: _execute() done
40074 1727204651.90378: dumping result to json
40074 1727204651.90382: done dumping result, returning
40074 1727204651.90385: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest0 [12b410aa-8751-9fd7-2501-000000000b43]
40074 1727204651.90387: sending task result for task 12b410aa-8751-9fd7-2501-000000000b43
40074 1727204651.90459: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b43
40074 1727204651.90464: WORKER PROCESS EXITING
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
40074 1727204651.90637: no more pending results, returning what we have
40074 1727204651.90640: results queue empty
40074 1727204651.90641: checking for any_errors_fatal
40074 1727204651.90647: done checking for any_errors_fatal
40074 1727204651.90648: checking for max_fail_percentage
40074 1727204651.90649: done checking for max_fail_percentage
40074 1727204651.90650: checking to see if all hosts have failed and the running result is not ok
40074 1727204651.90652: done checking to see if all hosts have failed
40074 1727204651.90652: getting the remaining hosts for this loop
40074 1727204651.90654: done getting the remaining hosts for this loop
40074 1727204651.90657: getting the next task for host managed-node2
40074 1727204651.90664: done getting next task for host managed-node2
40074 1727204651.90667: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }}
40074 1727204651.90672: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204651.90675: getting variables
40074 1727204651.90677: in VariableManager get_vars()
40074 1727204651.90743: Calling all_inventory to load vars for managed-node2
40074 1727204651.90747: Calling groups_inventory to load vars for managed-node2
40074 1727204651.90750: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204651.90762: Calling all_plugins_play to load vars for managed-node2
40074 1727204651.90766: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204651.90770: Calling groups_plugins_play to load vars for managed-node2
40074 1727204651.92321: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204651.94350: done with get_vars()
40074 1727204651.94375: done getting variables
40074 1727204651.94430: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
40074 1727204651.94525: variable 'profile' from source: include params
40074 1727204651.94529: variable 'item' from source: include params
40074 1727204651.94576: variable 'item' from source: include params

TASK [Get the fingerprint comment in ifcfg-ethtest0] ***************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62
Tuesday 24 September 2024 15:04:11 -0400 (0:00:00.056) 0:00:45.707 *****
40074 1727204651.94606: entering _queue_task() for managed-node2/command
40074 1727204651.95636: worker is 1 (out of 1 available)
40074 1727204651.95650: exiting _queue_task() for managed-node2/command
40074 1727204651.95664: done queuing things up, now waiting for results queue to drain
40074 1727204651.95666: waiting for pending results...
40074 1727204651.96073: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-ethtest0
40074 1727204651.96078: in run() - task 12b410aa-8751-9fd7-2501-000000000b44
40074 1727204651.96082: variable 'ansible_search_path' from source: unknown
40074 1727204651.96085: variable 'ansible_search_path' from source: unknown
40074 1727204651.96127: calling self._execute()
40074 1727204651.96247: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204651.96261: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204651.96286: variable 'omit' from source: magic vars
40074 1727204651.96741: variable 'ansible_distribution_major_version' from source: facts
40074 1727204651.96760: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204651.96940: variable 'profile_stat' from source: set_fact
40074 1727204651.96959: Evaluated conditional (profile_stat.stat.exists): False
40074 1727204651.96967: when evaluation is False, skipping this task
40074 1727204651.96975: _execute() done
40074 1727204651.96983: dumping result to json
40074 1727204651.97034: done dumping result, returning
40074 1727204651.97040: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-ethtest0 [12b410aa-8751-9fd7-2501-000000000b44]
40074 1727204651.97045: sending task result for task 12b410aa-8751-9fd7-2501-000000000b44
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
40074 1727204651.97212: no more pending results, returning what we have
40074 1727204651.97216: results queue empty
40074 1727204651.97218: checking for any_errors_fatal
40074 1727204651.97226: done checking for any_errors_fatal
40074 1727204651.97227: checking for max_fail_percentage
40074 1727204651.97229: done checking for max_fail_percentage
40074 1727204651.97230: checking to see if all hosts have failed and the running result is not ok
40074 1727204651.97232: done checking to see if all hosts have failed
40074 1727204651.97232: getting the remaining hosts for this loop
40074 1727204651.97234: done getting the remaining hosts for this loop
40074 1727204651.97239: getting the next task for host managed-node2
40074 1727204651.97251: done getting next task for host managed-node2
40074 1727204651.97254: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }}
40074 1727204651.97261: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204651.97267: getting variables
40074 1727204651.97269: in VariableManager get_vars()
40074 1727204651.97321: Calling all_inventory to load vars for managed-node2
40074 1727204651.97325: Calling groups_inventory to load vars for managed-node2
40074 1727204651.97328: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204651.97343: Calling all_plugins_play to load vars for managed-node2
40074 1727204651.97347: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204651.97351: Calling groups_plugins_play to load vars for managed-node2
40074 1727204651.98006: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b44
40074 1727204651.98010: WORKER PROCESS EXITING
40074 1727204651.99945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204652.03079: done with get_vars()
40074 1727204652.03129: done getting variables
40074 1727204652.03214: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
40074 1727204652.03363: variable 'profile' from source: include params
40074 1727204652.03368: variable 'item' from source: include params
40074 1727204652.03438: variable 'item' from source: include params

TASK [Verify the fingerprint comment in ifcfg-ethtest0] ************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69
Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.088) 0:00:45.796 *****
40074 1727204652.03481: entering _queue_task() for managed-node2/set_fact
40074 1727204652.03871: worker is 1 (out of 1 available)
40074 1727204652.03885: exiting _queue_task() for managed-node2/set_fact
40074 1727204652.04106: done queuing things up, now waiting for results queue to drain
40074 1727204652.04108: waiting for pending results...
40074 1727204652.04249: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0
40074 1727204652.04435: in run() - task 12b410aa-8751-9fd7-2501-000000000b45
40074 1727204652.04467: variable 'ansible_search_path' from source: unknown
40074 1727204652.04479: variable 'ansible_search_path' from source: unknown
40074 1727204652.04537: calling self._execute()
40074 1727204652.04677: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204652.04695: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204652.04714: variable 'omit' from source: magic vars
40074 1727204652.05221: variable 'ansible_distribution_major_version' from source: facts
40074 1727204652.05243: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204652.05430: variable 'profile_stat' from source: set_fact
40074 1727204652.05450: Evaluated conditional (profile_stat.stat.exists): False
40074 1727204652.05460: when evaluation is False, skipping this task
40074 1727204652.05467: _execute() done
40074 1727204652.05476: dumping result to json
40074 1727204652.05484: done dumping result, returning
40074 1727204652.05498: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-ethtest0 [12b410aa-8751-9fd7-2501-000000000b45]
40074 1727204652.05508: sending task result for task 12b410aa-8751-9fd7-2501-000000000b45
skipping: [managed-node2] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
40074 1727204652.05750: no more pending results, returning what we have
40074 1727204652.05755: results queue empty
40074 1727204652.05756: checking for any_errors_fatal
40074 1727204652.05766: done checking for any_errors_fatal
40074 1727204652.05767: checking for max_fail_percentage
40074 1727204652.05769: done checking for max_fail_percentage
40074 1727204652.05770: checking to see if all hosts have failed and the running result is not ok
40074 1727204652.05772: done checking to see if all hosts have failed
40074 1727204652.05773: getting the remaining hosts for this loop
40074 1727204652.05774: done getting the remaining hosts for this loop
40074 1727204652.05779: getting the next task for host managed-node2
40074 1727204652.05793: done getting next task for host managed-node2
40074 1727204652.05797: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}'
40074 1727204652.05802: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204652.05808: getting variables
40074 1727204652.05811: in VariableManager get_vars()
40074 1727204652.05978: Calling all_inventory to load vars for managed-node2
40074 1727204652.05981: Calling groups_inventory to load vars for managed-node2
40074 1727204652.05985: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204652.05994: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b45
40074 1727204652.05998: WORKER PROCESS EXITING
40074 1727204652.06012: Calling all_plugins_play to load vars for managed-node2
40074 1727204652.06016: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204652.06021: Calling groups_plugins_play to load vars for managed-node2
40074 1727204652.08695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204652.11851: done with get_vars()
40074 1727204652.11908: done getting variables
40074 1727204652.11986: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)
40074 1727204652.12157: variable 'profile' from source: include params
40074 1727204652.12162: variable 'item' from source: include params
40074 1727204652.12249: variable 'item' from source: include params

TASK [Assert that the profile is absent - 'ethtest0'] **************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5
Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.088) 0:00:45.884 *****
40074 1727204652.12288: entering _queue_task() for managed-node2/assert
40074 1727204652.12729: worker is 1 (out of 1 available)
40074 1727204652.12746: exiting _queue_task() for managed-node2/assert
40074 1727204652.12761: done queuing things up, now waiting for results queue to drain
40074 1727204652.12763: waiting for pending results...
40074 1727204652.13060: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'ethtest0'
40074 1727204652.13199: in run() - task 12b410aa-8751-9fd7-2501-000000000a6d
40074 1727204652.13225: variable 'ansible_search_path' from source: unknown
40074 1727204652.13230: variable 'ansible_search_path' from source: unknown
40074 1727204652.13269: calling self._execute()
40074 1727204652.13398: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204652.13407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204652.13422: variable 'omit' from source: magic vars
40074 1727204652.13915: variable 'ansible_distribution_major_version' from source: facts
40074 1727204652.13932: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204652.13940: variable 'omit' from source: magic vars
40074 1727204652.14005: variable 'omit' from source: magic vars
40074 1727204652.14127: variable 'profile' from source: include params
40074 1727204652.14133: variable 'item' from source: include params
40074 1727204652.14218: variable 'item' from source: include params
40074 1727204652.14294: variable 'omit' from source: magic vars
40074 1727204652.14300: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204652.14344: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204652.14368: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204652.14395: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204652.14407: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204652.14496: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204652.14499: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204652.14504: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204652.14600: Set connection var ansible_pipelining to False
40074 1727204652.14608: Set connection var ansible_shell_executable to /bin/sh
40074 1727204652.14611: Set connection var ansible_shell_type to sh
40074 1727204652.14696: Set connection var ansible_connection to ssh
40074 1727204652.14700: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204652.14703: Set connection var ansible_timeout to 10
40074 1727204652.14705: variable 'ansible_shell_executable' from source: unknown
40074 1727204652.14707: variable 'ansible_connection' from source: unknown
40074 1727204652.14709: variable 'ansible_module_compression' from source: unknown
40074 1727204652.14711: variable 'ansible_shell_type' from source: unknown
40074 1727204652.14714: variable 'ansible_shell_executable' from source: unknown
40074 1727204652.14716: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204652.14718: variable 'ansible_pipelining' from source: unknown
40074 1727204652.14720: variable 'ansible_timeout' from source: unknown
40074 1727204652.14725: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204652.14888: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204652.14905: variable 'omit' from source: magic vars
40074 1727204652.14948: starting attempt loop
40074 1727204652.14954: running the handler
40074 1727204652.15059: variable 'lsr_net_profile_exists' from source: set_fact
40074 1727204652.15063: Evaluated conditional (not lsr_net_profile_exists): True
40074 1727204652.15073: handler run complete
40074 1727204652.15093: attempt loop complete, returning result
40074 1727204652.15096: _execute() done
40074 1727204652.15099: dumping result to json
40074 1727204652.15104: done dumping result, returning
40074 1727204652.15112: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'ethtest0' [12b410aa-8751-9fd7-2501-000000000a6d]
40074 1727204652.15117: sending task result for task 12b410aa-8751-9fd7-2501-000000000a6d
40074 1727204652.15215: done sending task result for task 12b410aa-8751-9fd7-2501-000000000a6d
40074 1727204652.15218: WORKER PROCESS EXITING
ok: [managed-node2] => {
    "changed": false
}

MSG:

All assertions passed
40074 1727204652.15275: no more pending results, returning what we have
40074 1727204652.15279: results queue empty
40074 1727204652.15280: checking for any_errors_fatal
40074 1727204652.15287: done checking for any_errors_fatal
40074 1727204652.15288: checking for max_fail_percentage
40074 1727204652.15295: done checking for max_fail_percentage
40074 1727204652.15297: checking to see if all hosts have failed and the running result is not ok
40074 1727204652.15298: done checking to see if all hosts have failed
40074 1727204652.15299: getting the remaining hosts for this loop
40074 1727204652.15301: done getting the remaining hosts for this loop
40074 1727204652.15305: getting the next task for host managed-node2
40074 1727204652.15316: done getting next task for host managed-node2
40074 1727204652.15320: ^ task is: TASK: Include the task 'get_profile_stat.yml'
40074 1727204652.15324: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204652.15331: getting variables
40074 1727204652.15333: in VariableManager get_vars()
40074 1727204652.15378: Calling all_inventory to load vars for managed-node2
40074 1727204652.15381: Calling groups_inventory to load vars for managed-node2
40074 1727204652.15384: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204652.15404: Calling all_plugins_play to load vars for managed-node2
40074 1727204652.15408: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204652.15412: Calling groups_plugins_play to load vars for managed-node2
40074 1727204652.16695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204652.18992: done with get_vars()
40074 1727204652.19017: done getting variables

TASK [Include the task 'get_profile_stat.yml'] *********************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3
Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.068) 0:00:45.952 *****
40074 1727204652.19102: entering _queue_task() for managed-node2/include_tasks
40074 1727204652.19375: worker is 1 (out of 1 available)
40074 1727204652.19392: exiting _queue_task() for managed-node2/include_tasks
40074 1727204652.19408: done queuing things up, now waiting for results queue to drain
40074 1727204652.19410: waiting for pending results...
40074 1727204652.19610: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml'
40074 1727204652.19708: in run() - task 12b410aa-8751-9fd7-2501-000000000a71
40074 1727204652.19722: variable 'ansible_search_path' from source: unknown
40074 1727204652.19726: variable 'ansible_search_path' from source: unknown
40074 1727204652.19761: calling self._execute()
40074 1727204652.19846: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204652.19852: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204652.19866: variable 'omit' from source: magic vars
40074 1727204652.20264: variable 'ansible_distribution_major_version' from source: facts
40074 1727204652.20269: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204652.20272: _execute() done
40074 1727204652.20295: dumping result to json
40074 1727204652.20298: done dumping result, returning
40074 1727204652.20301: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-9fd7-2501-000000000a71]
40074 1727204652.20307: sending task result for task 12b410aa-8751-9fd7-2501-000000000a71
40074 1727204652.20563: done sending task result for task 12b410aa-8751-9fd7-2501-000000000a71
40074 1727204652.20566: WORKER PROCESS EXITING
40074 1727204652.20593: no more pending results, returning what we have
40074 1727204652.20597: in VariableManager get_vars()
40074 1727204652.20638: Calling all_inventory to load vars for managed-node2
40074 1727204652.20641: Calling groups_inventory to load vars for managed-node2
40074 1727204652.20644: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204652.20655: Calling all_plugins_play to load vars for managed-node2
40074 1727204652.20658: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204652.20662: Calling groups_plugins_play to load vars for managed-node2
40074 1727204652.22456: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204652.24525: done with get_vars()
40074 1727204652.24553: variable 'ansible_search_path' from source: unknown
40074 1727204652.24554: variable 'ansible_search_path' from source: unknown
40074 1727204652.24601: we have included files to process
40074 1727204652.24603: generating all_blocks data
40074 1727204652.24606: done generating all_blocks data
40074 1727204652.24613: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
40074 1727204652.24615: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
40074 1727204652.24621: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml
40074 1727204652.25823: done processing included file
40074 1727204652.25826: iterating over new_blocks loaded from include file
40074 1727204652.25827: in VariableManager get_vars()
40074 1727204652.25861: done with get_vars()
40074 1727204652.25863: filtering new block on tags
40074 1727204652.25931: done filtering new block on tags
40074 1727204652.25934: in VariableManager get_vars()
40074 1727204652.25951: done with get_vars()
40074 1727204652.25952: filtering new block on tags
40074 1727204652.26005: done filtering new block on tags
40074 1727204652.26008: done iterating over new_blocks loaded from include file
included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2
40074 1727204652.26013: extending task lists for all hosts with included blocks
40074 1727204652.26121: done extending task lists
40074 1727204652.26122: done processing included files
40074 1727204652.26123: results queue empty
40074 1727204652.26123: checking for any_errors_fatal
40074 1727204652.26126: done checking for any_errors_fatal
40074 1727204652.26127: checking for max_fail_percentage
40074 1727204652.26128: done checking for max_fail_percentage
40074 1727204652.26128: checking to see if all hosts have failed and the running result is not ok
40074 1727204652.26129: done checking to see if all hosts have failed
40074 1727204652.26130: getting the remaining hosts for this loop
40074 1727204652.26131: done getting the remaining hosts for this loop
40074 1727204652.26133: getting the next task for host managed-node2
40074 1727204652.26136: done getting next task for host managed-node2
40074 1727204652.26138: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag
40074 1727204652.26141: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False
40074 1727204652.26142: getting variables
40074 1727204652.26143: in VariableManager get_vars()
40074 1727204652.26154: Calling all_inventory to load vars for managed-node2
40074 1727204652.26156: Calling groups_inventory to load vars for managed-node2
40074 1727204652.26157: Calling all_plugins_inventory to load vars for managed-node2
40074 1727204652.26162: Calling all_plugins_play to load vars for managed-node2
40074 1727204652.26164: Calling groups_plugins_inventory to load vars for managed-node2
40074 1727204652.26166: Calling groups_plugins_play to load vars for managed-node2
40074 1727204652.27349: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
40074 1727204652.29474: done with get_vars()
40074 1727204652.29501: done getting variables
40074 1727204652.29539: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True)

TASK [Initialize NM profile exist and ansible_managed comment flag] ************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3
Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.104) 0:00:46.057 *****
40074 1727204652.29565: entering _queue_task() for managed-node2/set_fact
40074 1727204652.29850: worker is 1 (out of 1 available)
40074 1727204652.29867: exiting _queue_task() for managed-node2/set_fact
40074 1727204652.29882: done queuing things up, now waiting for results queue to drain
40074 1727204652.29883: waiting for pending results...
40074 1727204652.30080: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag
40074 1727204652.30186: in run() - task 12b410aa-8751-9fd7-2501-000000000b79
40074 1727204652.30200: variable 'ansible_search_path' from source: unknown
40074 1727204652.30204: variable 'ansible_search_path' from source: unknown
40074 1727204652.30240: calling self._execute()
40074 1727204652.30326: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204652.30333: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204652.30347: variable 'omit' from source: magic vars
40074 1727204652.30666: variable 'ansible_distribution_major_version' from source: facts
40074 1727204652.30678: Evaluated conditional (ansible_distribution_major_version != '6'): True
40074 1727204652.30682: variable 'omit' from source: magic vars
40074 1727204652.30731: variable 'omit' from source: magic vars
40074 1727204652.30760: variable 'omit' from source: magic vars
40074 1727204652.30801: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
40074 1727204652.30834: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
40074 1727204652.30853: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
40074 1727204652.30870: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204652.30882: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
40074 1727204652.30915: variable 'inventory_hostname' from source: host vars for 'managed-node2'
40074 1727204652.30921: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204652.30924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204652.31010: Set connection var ansible_pipelining to False
40074 1727204652.31020: Set connection var ansible_shell_executable to /bin/sh
40074 1727204652.31023: Set connection var ansible_shell_type to sh
40074 1727204652.31026: Set connection var ansible_connection to ssh
40074 1727204652.31031: Set connection var ansible_module_compression to ZIP_DEFLATED
40074 1727204652.31038: Set connection var ansible_timeout to 10
40074 1727204652.31061: variable 'ansible_shell_executable' from source: unknown
40074 1727204652.31064: variable 'ansible_connection' from source: unknown
40074 1727204652.31067: variable 'ansible_module_compression' from source: unknown
40074 1727204652.31070: variable 'ansible_shell_type' from source: unknown
40074 1727204652.31075: variable 'ansible_shell_executable' from source: unknown
40074 1727204652.31078: variable 'ansible_host' from source: host vars for 'managed-node2'
40074 1727204652.31084: variable 'ansible_pipelining' from source: unknown
40074 1727204652.31086: variable 'ansible_timeout' from source: unknown
40074 1727204652.31093: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2'
40074 1727204652.31213: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False)
40074 1727204652.31229: variable 'omit' from source: magic vars
40074 1727204652.31232: starting attempt loop
40074 1727204652.31235: running the handler
40074 1727204652.31248: handler run complete
40074 1727204652.31257: attempt loop complete, returning result
40074 1727204652.31260: _execute() done
40074 1727204652.31265: dumping result to json
40074 1727204652.31269: done dumping result, returning
40074 1727204652.31277: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-9fd7-2501-000000000b79]
40074 1727204652.31282: sending task result for task 12b410aa-8751-9fd7-2501-000000000b79
ok: [managed-node2] => {
    "ansible_facts": {
        "lsr_net_profile_ansible_managed": false,
        "lsr_net_profile_exists": false,
        "lsr_net_profile_fingerprint": false
    },
    "changed": false
}
40074 1727204652.31438: no more pending results, returning what we have
40074 1727204652.31441: results queue empty
40074 1727204652.31442: checking for any_errors_fatal
40074 1727204652.31444: done checking for any_errors_fatal
40074 1727204652.31445: checking for max_fail_percentage
40074 1727204652.31446: done checking for max_fail_percentage
40074 1727204652.31447: checking to see if all hosts have failed and the running result is not ok
40074 1727204652.31449: done checking to see if all hosts have failed
40074 1727204652.31450: getting the remaining hosts for this loop
40074 1727204652.31451: done getting the remaining hosts for this loop
40074 1727204652.31455: getting the next task for host managed-node2
40074 1727204652.31464: done getting next task for host managed-node2
40074 1727204652.31466: ^ task is: TASK: Stat profile file
40074 1727204652.31473: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state?
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204652.31476: getting variables 40074 1727204652.31478: in VariableManager get_vars() 40074 1727204652.31524: Calling all_inventory to load vars for managed-node2 40074 1727204652.31527: Calling groups_inventory to load vars for managed-node2 40074 1727204652.31529: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204652.31540: Calling all_plugins_play to load vars for managed-node2 40074 1727204652.31543: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204652.31547: Calling groups_plugins_play to load vars for managed-node2 40074 1727204652.32106: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b79 40074 1727204652.32110: WORKER PROCESS EXITING 40074 1727204652.32937: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204652.34545: done with get_vars() 40074 1727204652.34568: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.050) 0:00:46.108 ***** 40074 1727204652.34651: entering _queue_task() for managed-node2/stat 40074 1727204652.34922: worker is 1 (out of 1 available) 40074 1727204652.34937: exiting _queue_task() for managed-node2/stat 40074 1727204652.34951: 
done queuing things up, now waiting for results queue to drain 40074 1727204652.34953: waiting for pending results... 40074 1727204652.35149: running TaskExecutor() for managed-node2/TASK: Stat profile file 40074 1727204652.35242: in run() - task 12b410aa-8751-9fd7-2501-000000000b7a 40074 1727204652.35254: variable 'ansible_search_path' from source: unknown 40074 1727204652.35258: variable 'ansible_search_path' from source: unknown 40074 1727204652.35290: calling self._execute() 40074 1727204652.35378: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204652.35386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204652.35400: variable 'omit' from source: magic vars 40074 1727204652.35720: variable 'ansible_distribution_major_version' from source: facts 40074 1727204652.35731: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204652.35734: variable 'omit' from source: magic vars 40074 1727204652.35782: variable 'omit' from source: magic vars 40074 1727204652.35866: variable 'profile' from source: include params 40074 1727204652.35870: variable 'item' from source: include params 40074 1727204652.35926: variable 'item' from source: include params 40074 1727204652.35949: variable 'omit' from source: magic vars 40074 1727204652.35982: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204652.36015: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204652.36033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204652.36051: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204652.36067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, 
class_only=False) 40074 1727204652.36093: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204652.36098: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204652.36103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204652.36191: Set connection var ansible_pipelining to False 40074 1727204652.36198: Set connection var ansible_shell_executable to /bin/sh 40074 1727204652.36202: Set connection var ansible_shell_type to sh 40074 1727204652.36204: Set connection var ansible_connection to ssh 40074 1727204652.36212: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204652.36220: Set connection var ansible_timeout to 10 40074 1727204652.36242: variable 'ansible_shell_executable' from source: unknown 40074 1727204652.36245: variable 'ansible_connection' from source: unknown 40074 1727204652.36247: variable 'ansible_module_compression' from source: unknown 40074 1727204652.36252: variable 'ansible_shell_type' from source: unknown 40074 1727204652.36254: variable 'ansible_shell_executable' from source: unknown 40074 1727204652.36259: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204652.36264: variable 'ansible_pipelining' from source: unknown 40074 1727204652.36267: variable 'ansible_timeout' from source: unknown 40074 1727204652.36277: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204652.36445: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 40074 1727204652.36456: variable 'omit' from source: magic vars 40074 1727204652.36463: starting attempt loop 40074 1727204652.36466: running the handler 40074 1727204652.36479: _low_level_execute_command(): starting 40074 
1727204652.36486: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204652.37048: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204652.37052: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.37055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204652.37057: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204652.37060: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.37113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204652.37116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204652.37119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.37174: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.38975: stdout chunk (state=3): >>>/root <<< 40074 1727204652.39086: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204652.39156: stderr chunk (state=3): >>><<< 40074 1727204652.39158: stdout chunk (state=3): >>><<< 40074 1727204652.39173: _low_level_execute_command() 
done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204652.39188: _low_level_execute_command(): starting 40074 1727204652.39202: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331 `" && echo ansible-tmp-1727204652.3917797-42061-11021307287331="` echo /root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331 `" ) && sleep 0' 40074 1727204652.39685: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204652.39688: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.39701: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 40074 1727204652.39704: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.39747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204652.39756: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.39803: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.41925: stdout chunk (state=3): >>>ansible-tmp-1727204652.3917797-42061-11021307287331=/root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331 <<< 40074 1727204652.42044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204652.42098: stderr chunk (state=3): >>><<< 40074 1727204652.42102: stdout chunk (state=3): >>><<< 40074 1727204652.42119: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204652.3917797-42061-11021307287331=/root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204652.42167: variable 'ansible_module_compression' from source: unknown 40074 1727204652.42216: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 40074 1727204652.42250: variable 'ansible_facts' from source: unknown 40074 1727204652.42317: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/AnsiballZ_stat.py 40074 1727204652.42438: Sending initial data 40074 1727204652.42442: Sent initial data (152 bytes) 40074 1727204652.42893: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204652.42929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204652.42932: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 40074 1727204652.42936: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.42986: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204652.42991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.43039: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.44800: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 40074 1727204652.44805: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204652.44838: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204652.44876: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp3otxaqm5 /root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/AnsiballZ_stat.py <<< 40074 1727204652.44882: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/AnsiballZ_stat.py" <<< 40074 1727204652.44915: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmp3otxaqm5" to remote "/root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/AnsiballZ_stat.py" <<< 40074 1727204652.44919: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/AnsiballZ_stat.py" <<< 40074 1727204652.45702: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204652.45774: stderr chunk (state=3): >>><<< 40074 1727204652.45778: stdout chunk (state=3): >>><<< 40074 1727204652.45800: done transferring module to remote 40074 1727204652.45813: _low_level_execute_command(): starting 40074 1727204652.45818: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/ /root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/AnsiballZ_stat.py && sleep 0' 40074 1727204652.46265: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204652.46311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204652.46314: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204652.46319: stderr chunk (state=3): 
>>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 40074 1727204652.46322: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204652.46328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.46375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204652.46378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.46419: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.48496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204652.48500: stdout chunk (state=3): >>><<< 40074 1727204652.48503: stderr chunk (state=3): >>><<< 40074 1727204652.48506: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204652.48508: _low_level_execute_command(): starting 40074 1727204652.48511: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/AnsiballZ_stat.py && sleep 0' 40074 1727204652.49182: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204652.49203: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204652.49245: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.49356: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204652.49379: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 
1727204652.49405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.49496: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.67348: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 40074 1727204652.68839: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 40074 1727204652.68891: stderr chunk (state=3): >>><<< 40074 1727204652.68895: stdout chunk (state=3): >>><<< 40074 1727204652.68909: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-ethtest1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204652.68941: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-ethtest1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204652.68951: _low_level_execute_command(): starting 40074 1727204652.68960: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204652.3917797-42061-11021307287331/ > /dev/null 2>&1 && sleep 0' 40074 1727204652.69428: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204652.69431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.69434: stderr chunk (state=3): >>>debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204652.69436: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.69484: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204652.69496: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.69538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.71528: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204652.71574: stderr chunk (state=3): >>><<< 40074 1727204652.71578: stdout chunk (state=3): >>><<< 40074 1727204652.71597: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204652.71605: handler run complete 40074 1727204652.71627: attempt loop complete, returning result 40074 1727204652.71630: _execute() done 40074 1727204652.71633: dumping result to json 40074 1727204652.71638: done dumping result, returning 40074 1727204652.71647: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-9fd7-2501-000000000b7a] 40074 1727204652.71652: sending task result for task 12b410aa-8751-9fd7-2501-000000000b7a 40074 1727204652.71753: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b7a 40074 1727204652.71756: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 40074 1727204652.71851: no more pending results, returning what we have 40074 1727204652.71854: results queue empty 40074 1727204652.71855: checking for any_errors_fatal 40074 1727204652.71863: done checking for any_errors_fatal 40074 1727204652.71864: checking for max_fail_percentage 40074 1727204652.71868: done checking for max_fail_percentage 40074 1727204652.71869: checking to see if all hosts have failed and the running result is not ok 40074 1727204652.71870: done checking to see if all hosts have failed 40074 1727204652.71871: getting the remaining hosts for this loop 40074 1727204652.71872: done getting the remaining hosts for this loop 40074 1727204652.71877: getting the next task for host managed-node2 40074 1727204652.71885: done getting next task for host managed-node2 40074 1727204652.71888: ^ task is: TASK: Set NM profile exist flag based on the profile files 40074 1727204652.71895: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204652.71899: getting variables 40074 1727204652.71901: in VariableManager get_vars() 40074 1727204652.71943: Calling all_inventory to load vars for managed-node2 40074 1727204652.71946: Calling groups_inventory to load vars for managed-node2 40074 1727204652.71950: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204652.71961: Calling all_plugins_play to load vars for managed-node2 40074 1727204652.71964: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204652.71967: Calling groups_plugins_play to load vars for managed-node2 40074 1727204652.73237: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204652.74839: done with get_vars() 40074 1727204652.74863: done getting variables 40074 1727204652.74918: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.402) 0:00:46.511 ***** 40074 1727204652.74949: entering _queue_task() for managed-node2/set_fact 40074 1727204652.75215: worker is 1 (out of 1 available) 40074 1727204652.75229: exiting _queue_task() for managed-node2/set_fact 40074 1727204652.75244: done queuing things up, now waiting for results queue to drain 40074 1727204652.75246: waiting for pending results... 
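[editor's note] The "Stat profile file" result above (`"stat": {"exists": false}`) is what the upcoming `set_fact` task conditions on. A minimal sketch of that check, assuming a keyfile-style profile path (the path and file name here are illustrative, not taken from the role):

```shell
# Sketch: the "Stat profile file" task reduces to a file-existence test;
# set_fact then records the result as a flag. Paths are hypothetical.
dir="$(mktemp -d)"
f="$dir/ethtest1.nmconnection"          # stand-in for the NM keyfile location
[ -f "$f" ] && exists=true || exists=false
echo "exists=$exists"                   # prints exists=false, as in the log
touch "$f"
[ -f "$f" ] && exists=true || exists=false
echo "exists=$exists"                   # prints exists=true once the file is there
rm -rf "$dir"
```

With `exists=false`, the `when: profile_stat.stat.exists` conditional below evaluates False and the task is skipped, exactly as the log records.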
40074 1727204652.75450: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 40074 1727204652.75556: in run() - task 12b410aa-8751-9fd7-2501-000000000b7b 40074 1727204652.75568: variable 'ansible_search_path' from source: unknown 40074 1727204652.75580: variable 'ansible_search_path' from source: unknown 40074 1727204652.75609: calling self._execute() 40074 1727204652.75698: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204652.75708: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204652.75715: variable 'omit' from source: magic vars 40074 1727204652.76038: variable 'ansible_distribution_major_version' from source: facts 40074 1727204652.76049: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204652.76156: variable 'profile_stat' from source: set_fact 40074 1727204652.76167: Evaluated conditional (profile_stat.stat.exists): False 40074 1727204652.76170: when evaluation is False, skipping this task 40074 1727204652.76173: _execute() done 40074 1727204652.76179: dumping result to json 40074 1727204652.76182: done dumping result, returning 40074 1727204652.76190: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-9fd7-2501-000000000b7b] 40074 1727204652.76196: sending task result for task 12b410aa-8751-9fd7-2501-000000000b7b 40074 1727204652.76287: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b7b 40074 1727204652.76292: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 40074 1727204652.76347: no more pending results, returning what we have 40074 1727204652.76352: results queue empty 40074 1727204652.76354: checking for any_errors_fatal 40074 1727204652.76363: done checking for any_errors_fatal 40074 1727204652.76364: 
checking for max_fail_percentage 40074 1727204652.76365: done checking for max_fail_percentage 40074 1727204652.76367: checking to see if all hosts have failed and the running result is not ok 40074 1727204652.76368: done checking to see if all hosts have failed 40074 1727204652.76369: getting the remaining hosts for this loop 40074 1727204652.76370: done getting the remaining hosts for this loop 40074 1727204652.76374: getting the next task for host managed-node2 40074 1727204652.76382: done getting next task for host managed-node2 40074 1727204652.76385: ^ task is: TASK: Get NM profile info 40074 1727204652.76392: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204652.76396: getting variables 40074 1727204652.76398: in VariableManager get_vars() 40074 1727204652.76438: Calling all_inventory to load vars for managed-node2 40074 1727204652.76441: Calling groups_inventory to load vars for managed-node2 40074 1727204652.76444: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204652.76454: Calling all_plugins_play to load vars for managed-node2 40074 1727204652.76457: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204652.76461: Calling groups_plugins_play to load vars for managed-node2 40074 1727204652.81560: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204652.83144: done with get_vars() 40074 1727204652.83168: done getting variables 40074 1727204652.83211: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 15:04:12 -0400 (0:00:00.082) 0:00:46.594 ***** 40074 1727204652.83235: entering _queue_task() for managed-node2/shell 40074 1727204652.83517: worker is 1 (out of 1 available) 40074 1727204652.83532: exiting _queue_task() for managed-node2/shell 40074 1727204652.83546: done queuing things up, now waiting for results queue to drain 40074 1727204652.83549: waiting for pending results... 
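[editor's note] The "Get NM profile info" task about to run executes the pipeline `nmcli -f NAME,FILENAME connection show | grep ethtest1 | grep /etc`. The pipeline's exit status is that of the final `grep`, and `grep` exits 1 when nothing matches, so an absent profile produces rc=1 even though nothing went wrong with `nmcli` itself. A self-contained sketch fed with sample output instead of a live `nmcli` (the connection names shown are illustrative):

```shell
# The task's pipeline, driven by canned nmcli-style output.
nmcli_output="NAME      FILENAME
eth0      /run/NetworkManager/system-connections/eth0.nmconnection"

printf '%s\n' "$nmcli_output" | grep ethtest1 | grep /etc
echo "rc=$?"    # prints rc=1: grep's "no match" status becomes the pipeline's rc
```

This is why the task result further down reports `"rc": 1` and `"msg": "non-zero return code"` with empty stdout.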
40074 1727204652.83756: running TaskExecutor() for managed-node2/TASK: Get NM profile info 40074 1727204652.83864: in run() - task 12b410aa-8751-9fd7-2501-000000000b7c 40074 1727204652.83881: variable 'ansible_search_path' from source: unknown 40074 1727204652.83885: variable 'ansible_search_path' from source: unknown 40074 1727204652.83920: calling self._execute() 40074 1727204652.84009: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204652.84018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204652.84033: variable 'omit' from source: magic vars 40074 1727204652.84374: variable 'ansible_distribution_major_version' from source: facts 40074 1727204652.84385: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204652.84393: variable 'omit' from source: magic vars 40074 1727204652.84445: variable 'omit' from source: magic vars 40074 1727204652.84534: variable 'profile' from source: include params 40074 1727204652.84538: variable 'item' from source: include params 40074 1727204652.84599: variable 'item' from source: include params 40074 1727204652.84615: variable 'omit' from source: magic vars 40074 1727204652.84658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204652.84693: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204652.84712: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204652.84731: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204652.84742: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204652.84777: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 
1727204652.84782: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204652.84784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204652.84870: Set connection var ansible_pipelining to False 40074 1727204652.84878: Set connection var ansible_shell_executable to /bin/sh 40074 1727204652.84881: Set connection var ansible_shell_type to sh 40074 1727204652.84883: Set connection var ansible_connection to ssh 40074 1727204652.84893: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204652.84901: Set connection var ansible_timeout to 10 40074 1727204652.84930: variable 'ansible_shell_executable' from source: unknown 40074 1727204652.84933: variable 'ansible_connection' from source: unknown 40074 1727204652.84935: variable 'ansible_module_compression' from source: unknown 40074 1727204652.84940: variable 'ansible_shell_type' from source: unknown 40074 1727204652.84942: variable 'ansible_shell_executable' from source: unknown 40074 1727204652.84947: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204652.84952: variable 'ansible_pipelining' from source: unknown 40074 1727204652.84955: variable 'ansible_timeout' from source: unknown 40074 1727204652.84961: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204652.85078: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204652.85090: variable 'omit' from source: magic vars 40074 1727204652.85097: starting attempt loop 40074 1727204652.85105: running the handler 40074 1727204652.85114: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204652.85133: _low_level_execute_command(): starting 40074 1727204652.85140: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204652.85667: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204652.85705: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.85710: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204652.85712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.85769: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204652.85772: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204652.85777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.85827: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.87611: stdout chunk (state=3): >>>/root <<< 40074 1727204652.87721: stderr chunk (state=3): >>>debug2: Received 
exit status from master 0 <<< 40074 1727204652.87776: stderr chunk (state=3): >>><<< 40074 1727204652.87779: stdout chunk (state=3): >>><<< 40074 1727204652.87881: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204652.87885: _low_level_execute_command(): starting 40074 1727204652.87888: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368 `" && echo ansible-tmp-1727204652.878078-42078-281026075520368="` echo /root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368 `" ) && sleep 0' 40074 1727204652.89298: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.89429: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204652.89432: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.89868: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.91976: stdout chunk (state=3): >>>ansible-tmp-1727204652.878078-42078-281026075520368=/root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368 <<< 40074 1727204652.92166: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204652.92183: stdout chunk (state=3): >>><<< 40074 1727204652.92203: stderr chunk (state=3): >>><<< 40074 1727204652.92232: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204652.878078-42078-281026075520368=/root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204652.92394: variable 'ansible_module_compression' from source: unknown 40074 1727204652.92397: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204652.92400: variable 'ansible_facts' from source: unknown 40074 1727204652.92770: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/AnsiballZ_command.py 40074 1727204652.93008: Sending initial data 40074 1727204652.93021: Sent initial data (155 bytes) 40074 1727204652.93470: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204652.93485: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204652.93505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204652.93529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204652.93551: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204652.93564: stderr chunk 
(state=3): >>>debug2: match not found <<< 40074 1727204652.93578: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.93658: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.93701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204652.93721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204652.93753: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.93822: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.95557: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204652.95619: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 40074 1727204652.95666: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpf7lper6e /root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/AnsiballZ_command.py <<< 40074 1727204652.95670: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/AnsiballZ_command.py" <<< 40074 1727204652.95721: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpf7lper6e" to remote "/root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/AnsiballZ_command.py" <<< 40074 1727204652.96795: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204652.96882: stderr chunk (state=3): >>><<< 40074 1727204652.96893: stdout chunk (state=3): >>><<< 40074 1727204652.96985: done transferring module to remote 40074 1727204652.96988: _low_level_execute_command(): starting 40074 1727204652.96992: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/ /root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/AnsiballZ_command.py && sleep 0' 40074 1727204652.97634: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204652.97757: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration 
requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204652.97778: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204652.97806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204652.97883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204652.99880: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204652.99905: stdout chunk (state=3): >>><<< 40074 1727204652.99924: stderr chunk (state=3): >>><<< 40074 1727204652.99947: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204652.99958: _low_level_execute_command(): starting 40074 1727204653.00007: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/AnsiballZ_command.py && sleep 0' 40074 1727204653.00660: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204653.00680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204653.00702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204653.00785: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204653.00839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204653.00858: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204653.00889: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204653.00973: stderr chunk 
(state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204653.20598: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-24 15:04:13.184385", "end": "2024-09-24 15:04:13.204742", "delta": "0:00:00.020357", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204653.22456: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. <<< 40074 1727204653.22461: stdout chunk (state=3): >>><<< 40074 1727204653.22463: stderr chunk (state=3): >>><<< 40074 1727204653.22484: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "start": "2024-09-24 15:04:13.184385", "end": "2024-09-24 15:04:13.204742", "delta": "0:00:00.020357", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 40074 1727204653.22604: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204653.22608: _low_level_execute_command(): starting 40074 1727204653.22610: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204652.878078-42078-281026075520368/ > /dev/null 2>&1 && sleep 0' 40074 1727204653.23307: stderr chunk 
(state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204653.23434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204653.23473: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204653.23555: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204653.25655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204653.25684: stdout chunk (state=3): >>><<< 40074 1727204653.25687: stderr chunk (state=3): >>><<< 40074 1727204653.25710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204653.25895: handler run complete 40074 1727204653.25899: Evaluated conditional (False): False 40074 1727204653.25901: attempt loop complete, returning result 40074 1727204653.25904: _execute() done 40074 1727204653.25907: dumping result to json 40074 1727204653.25909: done dumping result, returning 40074 1727204653.25911: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-9fd7-2501-000000000b7c] 40074 1727204653.25914: sending task result for task 12b410aa-8751-9fd7-2501-000000000b7c 40074 1727204653.26003: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b7c 40074 1727204653.26006: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! 
=> { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep ethtest1 | grep /etc", "delta": "0:00:00.020357", "end": "2024-09-24 15:04:13.204742", "rc": 1, "start": "2024-09-24 15:04:13.184385" } MSG: non-zero return code ...ignoring 40074 1727204653.26114: no more pending results, returning what we have 40074 1727204653.26122: results queue empty 40074 1727204653.26123: checking for any_errors_fatal 40074 1727204653.26133: done checking for any_errors_fatal 40074 1727204653.26134: checking for max_fail_percentage 40074 1727204653.26136: done checking for max_fail_percentage 40074 1727204653.26137: checking to see if all hosts have failed and the running result is not ok 40074 1727204653.26139: done checking to see if all hosts have failed 40074 1727204653.26140: getting the remaining hosts for this loop 40074 1727204653.26142: done getting the remaining hosts for this loop 40074 1727204653.26146: getting the next task for host managed-node2 40074 1727204653.26157: done getting next task for host managed-node2 40074 1727204653.26160: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 40074 1727204653.26166: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204653.26172: getting variables 40074 1727204653.26174: in VariableManager get_vars() 40074 1727204653.26346: Calling all_inventory to load vars for managed-node2 40074 1727204653.26351: Calling groups_inventory to load vars for managed-node2 40074 1727204653.26354: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.26369: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.26372: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.26376: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.29060: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.31542: done with get_vars() 40074 1727204653.31570: done getting variables 40074 1727204653.31625: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.484) 0:00:47.078 ***** 40074 1727204653.31655: entering _queue_task() for managed-node2/set_fact 40074 1727204653.31946: worker is 1 (out of 1 available) 40074 1727204653.31962: exiting _queue_task() for 
managed-node2/set_fact 40074 1727204653.31977: done queuing things up, now waiting for results queue to drain 40074 1727204653.31979: waiting for pending results... 40074 1727204653.32210: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 40074 1727204653.32628: in run() - task 12b410aa-8751-9fd7-2501-000000000b7d 40074 1727204653.32633: variable 'ansible_search_path' from source: unknown 40074 1727204653.32636: variable 'ansible_search_path' from source: unknown 40074 1727204653.32639: calling self._execute() 40074 1727204653.32643: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.32646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.32648: variable 'omit' from source: magic vars 40074 1727204653.33056: variable 'ansible_distribution_major_version' from source: facts 40074 1727204653.33068: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204653.33249: variable 'nm_profile_exists' from source: set_fact 40074 1727204653.33264: Evaluated conditional (nm_profile_exists.rc == 0): False 40074 1727204653.33268: when evaluation is False, skipping this task 40074 1727204653.33272: _execute() done 40074 1727204653.33284: dumping result to json 40074 1727204653.33290: done dumping result, returning 40074 1727204653.33295: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-9fd7-2501-000000000b7d] 40074 1727204653.33298: sending task result for task 12b410aa-8751-9fd7-2501-000000000b7d skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 40074 1727204653.33504: no more pending results, returning what we have 40074 1727204653.33508: results queue empty 40074 1727204653.33509: checking for 
any_errors_fatal 40074 1727204653.33520: done checking for any_errors_fatal 40074 1727204653.33521: checking for max_fail_percentage 40074 1727204653.33523: done checking for max_fail_percentage 40074 1727204653.33524: checking to see if all hosts have failed and the running result is not ok 40074 1727204653.33526: done checking to see if all hosts have failed 40074 1727204653.33526: getting the remaining hosts for this loop 40074 1727204653.33528: done getting the remaining hosts for this loop 40074 1727204653.33532: getting the next task for host managed-node2 40074 1727204653.33544: done getting next task for host managed-node2 40074 1727204653.33547: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 40074 1727204653.33554: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204653.33558: getting variables 40074 1727204653.33560: in VariableManager get_vars() 40074 1727204653.33601: Calling all_inventory to load vars for managed-node2 40074 1727204653.33605: Calling groups_inventory to load vars for managed-node2 40074 1727204653.33607: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.33618: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.33621: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.33625: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.34297: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b7d 40074 1727204653.34300: WORKER PROCESS EXITING 40074 1727204653.35433: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.37339: done with get_vars() 40074 1727204653.37379: done getting variables 40074 1727204653.37455: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204653.37604: variable 'profile' from source: include params 40074 1727204653.37609: variable 'item' from source: include params 40074 1727204653.37688: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-ethtest1] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.060) 0:00:47.139 ***** 40074 1727204653.37734: entering _queue_task() for managed-node2/command 40074 1727204653.38059: worker is 1 (out of 1 available) 40074 1727204653.38073: exiting _queue_task() for managed-node2/command 40074 
1727204653.38087: done queuing things up, now waiting for results queue to drain 40074 1727204653.38090: waiting for pending results... 40074 1727204653.38303: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-ethtest1 40074 1727204653.38418: in run() - task 12b410aa-8751-9fd7-2501-000000000b7f 40074 1727204653.38439: variable 'ansible_search_path' from source: unknown 40074 1727204653.38444: variable 'ansible_search_path' from source: unknown 40074 1727204653.38471: calling self._execute() 40074 1727204653.38562: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.38569: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.38579: variable 'omit' from source: magic vars 40074 1727204653.38906: variable 'ansible_distribution_major_version' from source: facts 40074 1727204653.38917: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204653.39027: variable 'profile_stat' from source: set_fact 40074 1727204653.39039: Evaluated conditional (profile_stat.stat.exists): False 40074 1727204653.39042: when evaluation is False, skipping this task 40074 1727204653.39045: _execute() done 40074 1727204653.39050: dumping result to json 40074 1727204653.39055: done dumping result, returning 40074 1727204653.39061: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-ethtest1 [12b410aa-8751-9fd7-2501-000000000b7f] 40074 1727204653.39066: sending task result for task 12b410aa-8751-9fd7-2501-000000000b7f 40074 1727204653.39162: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b7f 40074 1727204653.39165: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 40074 1727204653.39240: no more pending results, returning what we have 40074 1727204653.39244: results queue empty 40074 
1727204653.39245: checking for any_errors_fatal 40074 1727204653.39252: done checking for any_errors_fatal 40074 1727204653.39253: checking for max_fail_percentage 40074 1727204653.39254: done checking for max_fail_percentage 40074 1727204653.39256: checking to see if all hosts have failed and the running result is not ok 40074 1727204653.39257: done checking to see if all hosts have failed 40074 1727204653.39258: getting the remaining hosts for this loop 40074 1727204653.39259: done getting the remaining hosts for this loop 40074 1727204653.39264: getting the next task for host managed-node2 40074 1727204653.39271: done getting next task for host managed-node2 40074 1727204653.39274: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 40074 1727204653.39280: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204653.39284: getting variables 40074 1727204653.39285: in VariableManager get_vars() 40074 1727204653.39326: Calling all_inventory to load vars for managed-node2 40074 1727204653.39329: Calling groups_inventory to load vars for managed-node2 40074 1727204653.39331: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.39342: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.39345: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.39349: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.40748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.42350: done with get_vars() 40074 1727204653.42372: done getting variables 40074 1727204653.42423: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204653.42512: variable 'profile' from source: include params 40074 1727204653.42515: variable 'item' from source: include params 40074 1727204653.42567: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-ethtest1] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.048) 0:00:47.187 ***** 40074 1727204653.42595: entering _queue_task() for managed-node2/set_fact 40074 1727204653.42844: worker is 1 (out of 1 available) 40074 1727204653.42859: exiting _queue_task() for managed-node2/set_fact 40074 1727204653.42873: done queuing things up, now waiting for results queue to drain 40074 1727204653.42874: waiting for pending results... 
40074 1727204653.43071: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 40074 1727204653.43186: in run() - task 12b410aa-8751-9fd7-2501-000000000b80 40074 1727204653.43200: variable 'ansible_search_path' from source: unknown 40074 1727204653.43203: variable 'ansible_search_path' from source: unknown 40074 1727204653.43239: calling self._execute() 40074 1727204653.43328: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.43336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.43348: variable 'omit' from source: magic vars 40074 1727204653.43672: variable 'ansible_distribution_major_version' from source: facts 40074 1727204653.43682: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204653.43785: variable 'profile_stat' from source: set_fact 40074 1727204653.43798: Evaluated conditional (profile_stat.stat.exists): False 40074 1727204653.43802: when evaluation is False, skipping this task 40074 1727204653.43804: _execute() done 40074 1727204653.43811: dumping result to json 40074 1727204653.43814: done dumping result, returning 40074 1727204653.43823: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-ethtest1 [12b410aa-8751-9fd7-2501-000000000b80] 40074 1727204653.43828: sending task result for task 12b410aa-8751-9fd7-2501-000000000b80 40074 1727204653.43922: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b80 40074 1727204653.43925: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 40074 1727204653.43976: no more pending results, returning what we have 40074 1727204653.43980: results queue empty 40074 1727204653.43981: checking for any_errors_fatal 40074 1727204653.43992: done checking for any_errors_fatal 40074 1727204653.43993: 
checking for max_fail_percentage 40074 1727204653.43995: done checking for max_fail_percentage 40074 1727204653.43996: checking to see if all hosts have failed and the running result is not ok 40074 1727204653.43997: done checking to see if all hosts have failed 40074 1727204653.43998: getting the remaining hosts for this loop 40074 1727204653.44000: done getting the remaining hosts for this loop 40074 1727204653.44004: getting the next task for host managed-node2 40074 1727204653.44013: done getting next task for host managed-node2 40074 1727204653.44019: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 40074 1727204653.44025: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204653.44029: getting variables 40074 1727204653.44031: in VariableManager get_vars() 40074 1727204653.44069: Calling all_inventory to load vars for managed-node2 40074 1727204653.44072: Calling groups_inventory to load vars for managed-node2 40074 1727204653.44074: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.44085: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.44088: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.44101: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.45336: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.47063: done with get_vars() 40074 1727204653.47085: done getting variables 40074 1727204653.47143: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204653.47238: variable 'profile' from source: include params 40074 1727204653.47242: variable 'item' from source: include params 40074 1727204653.47292: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-ethtest1] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.047) 0:00:47.234 ***** 40074 1727204653.47321: entering _queue_task() for managed-node2/command 40074 1727204653.47584: worker is 1 (out of 1 available) 40074 1727204653.47599: exiting _queue_task() for managed-node2/command 40074 1727204653.47615: done queuing things up, now waiting for results queue to drain 40074 1727204653.47619: waiting for pending results... 
40074 1727204653.47826: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-ethtest1 40074 1727204653.47931: in run() - task 12b410aa-8751-9fd7-2501-000000000b81 40074 1727204653.47944: variable 'ansible_search_path' from source: unknown 40074 1727204653.47947: variable 'ansible_search_path' from source: unknown 40074 1727204653.47981: calling self._execute() 40074 1727204653.48075: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.48082: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.48094: variable 'omit' from source: magic vars 40074 1727204653.48420: variable 'ansible_distribution_major_version' from source: facts 40074 1727204653.48429: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204653.48536: variable 'profile_stat' from source: set_fact 40074 1727204653.48547: Evaluated conditional (profile_stat.stat.exists): False 40074 1727204653.48551: when evaluation is False, skipping this task 40074 1727204653.48554: _execute() done 40074 1727204653.48559: dumping result to json 40074 1727204653.48563: done dumping result, returning 40074 1727204653.48570: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-ethtest1 [12b410aa-8751-9fd7-2501-000000000b81] 40074 1727204653.48577: sending task result for task 12b410aa-8751-9fd7-2501-000000000b81 40074 1727204653.48672: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b81 40074 1727204653.48674: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 40074 1727204653.48763: no more pending results, returning what we have 40074 1727204653.48766: results queue empty 40074 1727204653.48767: checking for any_errors_fatal 40074 1727204653.48772: done checking for any_errors_fatal 40074 1727204653.48773: checking for 
max_fail_percentage 40074 1727204653.48775: done checking for max_fail_percentage 40074 1727204653.48776: checking to see if all hosts have failed and the running result is not ok 40074 1727204653.48777: done checking to see if all hosts have failed 40074 1727204653.48778: getting the remaining hosts for this loop 40074 1727204653.48779: done getting the remaining hosts for this loop 40074 1727204653.48784: getting the next task for host managed-node2 40074 1727204653.48793: done getting next task for host managed-node2 40074 1727204653.48796: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 40074 1727204653.48802: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204653.48806: getting variables 40074 1727204653.48808: in VariableManager get_vars() 40074 1727204653.48849: Calling all_inventory to load vars for managed-node2 40074 1727204653.48852: Calling groups_inventory to load vars for managed-node2 40074 1727204653.48855: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.48865: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.48868: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.48876: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.50084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.51691: done with get_vars() 40074 1727204653.51718: done getting variables 40074 1727204653.51766: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204653.51857: variable 'profile' from source: include params 40074 1727204653.51860: variable 'item' from source: include params 40074 1727204653.51909: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-ethtest1] ************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.046) 0:00:47.281 ***** 40074 1727204653.51941: entering _queue_task() for managed-node2/set_fact 40074 1727204653.52191: worker is 1 (out of 1 available) 40074 1727204653.52207: exiting _queue_task() for managed-node2/set_fact 40074 1727204653.52223: done queuing things up, now waiting for results queue to drain 40074 1727204653.52225: waiting for pending results... 
40074 1727204653.52427: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-ethtest1 40074 1727204653.52524: in run() - task 12b410aa-8751-9fd7-2501-000000000b82 40074 1727204653.52541: variable 'ansible_search_path' from source: unknown 40074 1727204653.52545: variable 'ansible_search_path' from source: unknown 40074 1727204653.52577: calling self._execute() 40074 1727204653.52671: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.52675: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.52687: variable 'omit' from source: magic vars 40074 1727204653.53008: variable 'ansible_distribution_major_version' from source: facts 40074 1727204653.53021: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204653.53123: variable 'profile_stat' from source: set_fact 40074 1727204653.53135: Evaluated conditional (profile_stat.stat.exists): False 40074 1727204653.53141: when evaluation is False, skipping this task 40074 1727204653.53145: _execute() done 40074 1727204653.53148: dumping result to json 40074 1727204653.53151: done dumping result, returning 40074 1727204653.53160: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-ethtest1 [12b410aa-8751-9fd7-2501-000000000b82] 40074 1727204653.53165: sending task result for task 12b410aa-8751-9fd7-2501-000000000b82 40074 1727204653.53259: done sending task result for task 12b410aa-8751-9fd7-2501-000000000b82 40074 1727204653.53262: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 40074 1727204653.53322: no more pending results, returning what we have 40074 1727204653.53325: results queue empty 40074 1727204653.53326: checking for any_errors_fatal 40074 1727204653.53332: done checking for any_errors_fatal 40074 1727204653.53333: 
checking for max_fail_percentage 40074 1727204653.53335: done checking for max_fail_percentage 40074 1727204653.53336: checking to see if all hosts have failed and the running result is not ok 40074 1727204653.53337: done checking to see if all hosts have failed 40074 1727204653.53338: getting the remaining hosts for this loop 40074 1727204653.53339: done getting the remaining hosts for this loop 40074 1727204653.53343: getting the next task for host managed-node2 40074 1727204653.53352: done getting next task for host managed-node2 40074 1727204653.53355: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 40074 1727204653.53360: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204653.53364: getting variables 40074 1727204653.53366: in VariableManager get_vars() 40074 1727204653.53406: Calling all_inventory to load vars for managed-node2 40074 1727204653.53409: Calling groups_inventory to load vars for managed-node2 40074 1727204653.53411: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.53424: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.53428: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.53431: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.54803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.56427: done with get_vars() 40074 1727204653.56452: done getting variables 40074 1727204653.56503: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 40074 1727204653.56599: variable 'profile' from source: include params 40074 1727204653.56602: variable 'item' from source: include params 40074 1727204653.56653: variable 'item' from source: include params TASK [Assert that the profile is absent - 'ethtest1'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.047) 0:00:47.328 ***** 40074 1727204653.56679: entering _queue_task() for managed-node2/assert 40074 1727204653.56934: worker is 1 (out of 1 available) 40074 1727204653.56950: exiting _queue_task() for managed-node2/assert 40074 1727204653.56963: done queuing things up, now waiting for results queue to drain 40074 1727204653.56965: waiting for pending results... 
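
[editor's note] The assert task queued above (tasks/assert_profile_absent.yml:5) evaluates `not lsr_net_profile_exists`, per the "Evaluated conditional" record later in the log. A minimal sketch, assuming the standard `assert` shape — the failure message is invented for illustration:

```yaml
# Hedged reconstruction of tasks/assert_profile_absent.yml:5.
# The `that` expression comes from the log; `fail_msg` wording is an assumption.
- name: "Assert that the profile is absent - '{{ profile }}'"
  assert:
    that:
      - not lsr_net_profile_exists
    fail_msg: "Profile {{ profile }} is unexpectedly present"
```

Note that `assert` runs entirely on the controller: the log shows the connection and shell vars being resolved, but no `_low_level_execute_command()` calls, because no module is shipped to the remote host.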
40074 1727204653.57160: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'ethtest1' 40074 1727204653.57258: in run() - task 12b410aa-8751-9fd7-2501-000000000a72 40074 1727204653.57270: variable 'ansible_search_path' from source: unknown 40074 1727204653.57273: variable 'ansible_search_path' from source: unknown 40074 1727204653.57310: calling self._execute() 40074 1727204653.57398: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.57405: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.57421: variable 'omit' from source: magic vars 40074 1727204653.57737: variable 'ansible_distribution_major_version' from source: facts 40074 1727204653.57751: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204653.57758: variable 'omit' from source: magic vars 40074 1727204653.57800: variable 'omit' from source: magic vars 40074 1727204653.57980: variable 'profile' from source: include params 40074 1727204653.57988: variable 'item' from source: include params 40074 1727204653.57993: variable 'item' from source: include params 40074 1727204653.58037: variable 'omit' from source: magic vars 40074 1727204653.58058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204653.58202: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204653.58205: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204653.58208: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204653.58211: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204653.58213: variable 'inventory_hostname' from source: host vars for 
'managed-node2' 40074 1727204653.58216: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.58221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.58330: Set connection var ansible_pipelining to False 40074 1727204653.58338: Set connection var ansible_shell_executable to /bin/sh 40074 1727204653.58341: Set connection var ansible_shell_type to sh 40074 1727204653.58343: Set connection var ansible_connection to ssh 40074 1727204653.58354: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204653.58361: Set connection var ansible_timeout to 10 40074 1727204653.58433: variable 'ansible_shell_executable' from source: unknown 40074 1727204653.58437: variable 'ansible_connection' from source: unknown 40074 1727204653.58440: variable 'ansible_module_compression' from source: unknown 40074 1727204653.58442: variable 'ansible_shell_type' from source: unknown 40074 1727204653.58444: variable 'ansible_shell_executable' from source: unknown 40074 1727204653.58447: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.58449: variable 'ansible_pipelining' from source: unknown 40074 1727204653.58451: variable 'ansible_timeout' from source: unknown 40074 1727204653.58453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.58588: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204653.58604: variable 'omit' from source: magic vars 40074 1727204653.58611: starting attempt loop 40074 1727204653.58614: running the handler 40074 1727204653.58794: variable 'lsr_net_profile_exists' from source: set_fact 40074 1727204653.58798: Evaluated conditional (not 
lsr_net_profile_exists): True 40074 1727204653.58800: handler run complete 40074 1727204653.58803: attempt loop complete, returning result 40074 1727204653.58805: _execute() done 40074 1727204653.58808: dumping result to json 40074 1727204653.58811: done dumping result, returning 40074 1727204653.58813: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'ethtest1' [12b410aa-8751-9fd7-2501-000000000a72] 40074 1727204653.58815: sending task result for task 12b410aa-8751-9fd7-2501-000000000a72 40074 1727204653.58931: done sending task result for task 12b410aa-8751-9fd7-2501-000000000a72 40074 1727204653.58933: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 40074 1727204653.59013: no more pending results, returning what we have 40074 1727204653.59019: results queue empty 40074 1727204653.59020: checking for any_errors_fatal 40074 1727204653.59026: done checking for any_errors_fatal 40074 1727204653.59027: checking for max_fail_percentage 40074 1727204653.59028: done checking for max_fail_percentage 40074 1727204653.59029: checking to see if all hosts have failed and the running result is not ok 40074 1727204653.59031: done checking to see if all hosts have failed 40074 1727204653.59031: getting the remaining hosts for this loop 40074 1727204653.59033: done getting the remaining hosts for this loop 40074 1727204653.59037: getting the next task for host managed-node2 40074 1727204653.59045: done getting next task for host managed-node2 40074 1727204653.59048: ^ task is: TASK: Verify network state restored to default 40074 1727204653.59052: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 40074 1727204653.59056: getting variables 40074 1727204653.59058: in VariableManager get_vars() 40074 1727204653.59142: Calling all_inventory to load vars for managed-node2 40074 1727204653.59146: Calling groups_inventory to load vars for managed-node2 40074 1727204653.59150: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.59161: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.59165: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.59169: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.60548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.62808: done with get_vars() 40074 1727204653.62842: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:169 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.062) 0:00:47.391 ***** 40074 1727204653.62950: entering _queue_task() for managed-node2/include_tasks 40074 1727204653.63284: worker is 1 (out of 1 available) 40074 1727204653.63499: exiting _queue_task() for managed-node2/include_tasks 40074 1727204653.63511: done queuing things up, now waiting for results queue to drain 40074 1727204653.63512: waiting for pending results... 
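
[editor's note] The "Verify network state restored to default" task above (tests_route_device.yml:169) is an `include_tasks` that pulls in check_network_dns.yml — the log then shows the included file being loaded, its blocks generated, and host task lists extended. A minimal sketch of that include, using only names visible in the log:

```yaml
# Sketch of the include at tests_route_device.yml:169; both names are taken
# from the log (the relative path within the playbook dir is an assumption).
- name: Verify network state restored to default
  include_tasks: tasks/check_network_dns.yml
```

Unlike `import_tasks`, this is resolved at runtime: the "we have included files to process / generating all_blocks data / extending task lists" records above are the dynamic-include machinery inserting the new tasks after the current one.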
40074 1727204653.63814: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 40074 1727204653.63824: in run() - task 12b410aa-8751-9fd7-2501-0000000000bb 40074 1727204653.63827: variable 'ansible_search_path' from source: unknown 40074 1727204653.63830: calling self._execute() 40074 1727204653.63920: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.63924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.63934: variable 'omit' from source: magic vars 40074 1727204653.64386: variable 'ansible_distribution_major_version' from source: facts 40074 1727204653.64406: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204653.64414: _execute() done 40074 1727204653.64422: dumping result to json 40074 1727204653.64425: done dumping result, returning 40074 1727204653.64431: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [12b410aa-8751-9fd7-2501-0000000000bb] 40074 1727204653.64438: sending task result for task 12b410aa-8751-9fd7-2501-0000000000bb 40074 1727204653.64682: done sending task result for task 12b410aa-8751-9fd7-2501-0000000000bb 40074 1727204653.64686: WORKER PROCESS EXITING 40074 1727204653.64713: no more pending results, returning what we have 40074 1727204653.64718: in VariableManager get_vars() 40074 1727204653.64760: Calling all_inventory to load vars for managed-node2 40074 1727204653.64764: Calling groups_inventory to load vars for managed-node2 40074 1727204653.64767: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.64778: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.64782: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.64785: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.67003: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.69949: done with get_vars() 40074 1727204653.69986: variable 'ansible_search_path' from source: unknown 40074 1727204653.70007: we have included files to process 40074 1727204653.70008: generating all_blocks data 40074 1727204653.70011: done generating all_blocks data 40074 1727204653.70017: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 40074 1727204653.70019: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 40074 1727204653.70022: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 40074 1727204653.70541: done processing included file 40074 1727204653.70543: iterating over new_blocks loaded from include file 40074 1727204653.70545: in VariableManager get_vars() 40074 1727204653.70570: done with get_vars() 40074 1727204653.70572: filtering new block on tags 40074 1727204653.70619: done filtering new block on tags 40074 1727204653.70622: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 40074 1727204653.70627: extending task lists for all hosts with included blocks 40074 1727204653.73134: done extending task lists 40074 1727204653.73136: done processing included files 40074 1727204653.73137: results queue empty 40074 1727204653.73138: checking for any_errors_fatal 40074 1727204653.73143: done checking for any_errors_fatal 40074 1727204653.73144: checking for max_fail_percentage 40074 1727204653.73145: done checking for max_fail_percentage 40074 1727204653.73146: checking to see if all hosts have failed and the running 
result is not ok 40074 1727204653.73148: done checking to see if all hosts have failed 40074 1727204653.73149: getting the remaining hosts for this loop 40074 1727204653.73151: done getting the remaining hosts for this loop 40074 1727204653.73154: getting the next task for host managed-node2 40074 1727204653.73159: done getting next task for host managed-node2 40074 1727204653.73161: ^ task is: TASK: Check routes and DNS 40074 1727204653.73165: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204653.73168: getting variables 40074 1727204653.73170: in VariableManager get_vars() 40074 1727204653.73188: Calling all_inventory to load vars for managed-node2 40074 1727204653.73193: Calling groups_inventory to load vars for managed-node2 40074 1727204653.73196: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204653.73203: Calling all_plugins_play to load vars for managed-node2 40074 1727204653.73206: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204653.73210: Calling groups_plugins_play to load vars for managed-node2 40074 1727204653.75278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204653.78258: done with get_vars() 40074 1727204653.78297: done getting variables 40074 1727204653.78354: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 15:04:13 -0400 (0:00:00.154) 0:00:47.545 ***** 40074 1727204653.78393: entering _queue_task() for managed-node2/shell 40074 1727204653.78780: worker is 1 (out of 1 available) 40074 1727204653.78998: exiting _queue_task() for managed-node2/shell 40074 1727204653.79011: done queuing things up, now waiting for results queue to drain 40074 1727204653.79012: waiting for pending results... 
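
[editor's note] The "Check routes and DNS" task queued above (tasks/check_network_dns.yml:6) is a `shell` action; its command body is not visible in this excerpt, so the body below is an assumption chosen to match the task name. What the following log records do show verbatim is the remote-execution pipeline Ansible drives for it: `echo ~ && sleep 0` to discover the remote home, a `mkdir` for a per-task temp directory, an sftp upload of `AnsiballZ_command.py`, then a `chmod u+x` before execution.

```yaml
# Sketch only: the real command at check_network_dns.yml:6 is not shown in
# this excerpt. A plausible body for a routes-and-DNS check might be:
- name: Check routes and DNS
  shell: |
    ip route
    cat /etc/resolv.conf
```

Whatever the real command is, it is wrapped by the `command` module (the log shows the cached `ansible.modules.command-ZIP_DEFLATED` AnsiballZ payload being reused) rather than run as a bare SSH command.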
40074 1727204653.79145: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 40074 1727204653.79274: in run() - task 12b410aa-8751-9fd7-2501-000000000bb6 40074 1727204653.79392: variable 'ansible_search_path' from source: unknown 40074 1727204653.79404: variable 'ansible_search_path' from source: unknown 40074 1727204653.79409: calling self._execute() 40074 1727204653.79451: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.79461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.79473: variable 'omit' from source: magic vars 40074 1727204653.79922: variable 'ansible_distribution_major_version' from source: facts 40074 1727204653.79932: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204653.79940: variable 'omit' from source: magic vars 40074 1727204653.80011: variable 'omit' from source: magic vars 40074 1727204653.80069: variable 'omit' from source: magic vars 40074 1727204653.80101: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 40074 1727204653.80143: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 40074 1727204653.80177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 40074 1727204653.80286: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204653.80291: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 40074 1727204653.80294: variable 'inventory_hostname' from source: host vars for 'managed-node2' 40074 1727204653.80297: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.80299: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.80390: 
Set connection var ansible_pipelining to False 40074 1727204653.80402: Set connection var ansible_shell_executable to /bin/sh 40074 1727204653.80405: Set connection var ansible_shell_type to sh 40074 1727204653.80408: Set connection var ansible_connection to ssh 40074 1727204653.80419: Set connection var ansible_module_compression to ZIP_DEFLATED 40074 1727204653.80426: Set connection var ansible_timeout to 10 40074 1727204653.80458: variable 'ansible_shell_executable' from source: unknown 40074 1727204653.80461: variable 'ansible_connection' from source: unknown 40074 1727204653.80464: variable 'ansible_module_compression' from source: unknown 40074 1727204653.80507: variable 'ansible_shell_type' from source: unknown 40074 1727204653.80510: variable 'ansible_shell_executable' from source: unknown 40074 1727204653.80512: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204653.80515: variable 'ansible_pipelining' from source: unknown 40074 1727204653.80520: variable 'ansible_timeout' from source: unknown 40074 1727204653.80523: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204653.80658: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204653.80726: variable 'omit' from source: magic vars 40074 1727204653.80730: starting attempt loop 40074 1727204653.80733: running the handler 40074 1727204653.80735: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 40074 1727204653.80738: 
_low_level_execute_command(): starting 40074 1727204653.80740: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 40074 1727204653.81504: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204653.81586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204653.81612: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204653.81638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204653.81723: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204653.83527: stdout chunk (state=3): >>>/root <<< 40074 1727204653.83854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204653.83857: stdout chunk (state=3): >>><<< 40074 1727204653.83860: stderr chunk (state=3): >>><<< 40074 1727204653.83863: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204653.83866: _low_level_execute_command(): starting 40074 1727204653.83869: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109 `" && echo ansible-tmp-1727204653.8374963-42112-151726495429109="` echo /root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109 `" ) && sleep 0' 40074 1727204653.84488: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204653.84508: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204653.84524: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204653.84559: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204653.84607: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204653.84626: stderr chunk (state=3): >>>debug2: match found <<< 40074 1727204653.84706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204653.84737: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204653.84767: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204653.84787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204653.84873: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204653.94872: stdout chunk (state=3): >>>ansible-tmp-1727204653.8374963-42112-151726495429109=/root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109 <<< 40074 1727204653.94877: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204653.94937: stderr chunk (state=3): >>><<< 40074 1727204653.95059: stdout chunk (state=3): >>><<< 40074 1727204653.95064: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204653.8374963-42112-151726495429109=/root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204653.95067: variable 'ansible_module_compression' from source: unknown 40074 1727204653.95099: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-40074dxmwzyw4/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 40074 1727204653.95147: variable 'ansible_facts' from source: unknown 40074 1727204653.95263: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/AnsiballZ_command.py 40074 1727204653.95608: Sending initial data 40074 1727204653.95611: Sent initial data (156 bytes) 40074 1727204653.96808: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204653.97009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204653.97030: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204653.97056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204653.97161: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204653.98845: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 40074 1727204653.98873: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 40074 1727204653.98917: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 40074 1727204653.98963: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpxxn6nouf /root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/AnsiballZ_command.py <<< 40074 1727204653.98974: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/AnsiballZ_command.py" <<< 40074 1727204653.98987: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-40074dxmwzyw4/tmpxxn6nouf" to remote "/root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/AnsiballZ_command.py" <<< 40074 1727204654.00070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204654.00174: stderr chunk (state=3): >>><<< 40074 1727204654.00177: stdout chunk (state=3): >>><<< 40074 1727204654.00180: done transferring module to remote 40074 1727204654.00290: _low_level_execute_command(): starting 40074 1727204654.00304: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/ /root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/AnsiballZ_command.py && sleep 0' 40074 1727204654.00976: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204654.00996: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204654.01013: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204654.01033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204654.01159: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204654.01184: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204654.01265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204654.03283: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204654.03287: stdout chunk (state=3): >>><<< 40074 1727204654.03495: stderr chunk (state=3): >>><<< 40074 1727204654.03499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204654.03502: _low_level_execute_command(): starting 40074 1727204654.03505: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/AnsiballZ_command.py && sleep 0' 40074 1727204654.03991: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 40074 1727204654.04006: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204654.04017: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204654.04036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 40074 1727204654.04056: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 40074 1727204654.04065: stderr chunk (state=3): >>>debug2: match not found <<< 40074 1727204654.04076: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204654.04092: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 40074 1727204654.04101: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 40074 1727204654.04109: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 40074 1727204654.04118: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 40074 1727204654.04168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204654.04225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204654.04247: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 40074 1727204654.04284: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204654.04334: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204654.23085: stdout chunk (state=3): >>> <<< 40074 1727204654.23129: stdout chunk (state=3): >>>{"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2849sec preferred_lft 2849sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n36: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether ee:fa:4b:42:85:80 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This 
file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:04:14.220470", "end": "2024-09-24 15:04:14.229958", "delta": "0:00:00.009488", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 40074 1727204654.24979: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 40074 1727204654.25043: stderr chunk (state=3): >>><<< 40074 1727204654.25047: stdout chunk (state=3): >>><<< 40074 1727204654.25067: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2849sec preferred_lft 2849sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\n36: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000\n link/ether ee:fa:4b:42:85:80 brd ff:ff:ff:ff:ff:ff\n inet 192.0.2.72/31 scope global noprefixroute rpltstbr\n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \n192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. 
This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 15:04:14.220470", "end": "2024-09-24 15:04:14.229958", "delta": "0:00:00.009488", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 40074 1727204654.25122: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 40074 1727204654.25136: _low_level_execute_command(): starting 40074 1727204654.25144: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204653.8374963-42112-151726495429109/ > /dev/null 2>&1 && sleep 0' 40074 1727204654.25593: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204654.25631: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 40074 1727204654.25634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204654.25637: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 40074 1727204654.25639: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 40074 1727204654.25694: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 40074 1727204654.25697: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 40074 1727204654.25749: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 40074 1727204654.27734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 40074 1727204654.27785: stderr chunk (state=3): >>><<< 40074 1727204654.27789: stdout chunk (state=3): >>><<< 40074 1727204654.27807: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 40074 1727204654.27815: handler run complete 40074 1727204654.27842: Evaluated conditional (False): False 40074 1727204654.27855: attempt loop complete, returning result 40074 1727204654.27858: _execute() done 40074 1727204654.27862: dumping result to json 40074 1727204654.27870: done dumping result, returning 40074 1727204654.27878: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [12b410aa-8751-9fd7-2501-000000000bb6] 40074 1727204654.27883: sending task result for task 12b410aa-8751-9fd7-2501-000000000bb6 40074 1727204654.28012: done sending task result for task 12b410aa-8751-9fd7-2501-000000000bb6 40074 1727204654.28016: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.009488", "end": "2024-09-24 15:04:14.229958", "rc": 0, "start": "2024-09-24 15:04:14.220470" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue 
state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2849sec preferred_lft 2849sec inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute valid_lft forever preferred_lft forever 36: rpltstbr: mtu 1500 qdisc noqueue state DOWN group default qlen 1000 link/ether ee:fa:4b:42:85:80 brd ff:ff:ff:ff:ff:ff inet 192.0.2.72/31 scope global noprefixroute rpltstbr valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 192.0.2.72/31 dev rpltstbr proto kernel scope link src 192.0.2.72 metric 425 linkdown IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. # # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. 
nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 40074 1727204654.28115: no more pending results, returning what we have 40074 1727204654.28122: results queue empty 40074 1727204654.28123: checking for any_errors_fatal 40074 1727204654.28125: done checking for any_errors_fatal 40074 1727204654.28125: checking for max_fail_percentage 40074 1727204654.28131: done checking for max_fail_percentage 40074 1727204654.28132: checking to see if all hosts have failed and the running result is not ok 40074 1727204654.28134: done checking to see if all hosts have failed 40074 1727204654.28141: getting the remaining hosts for this loop 40074 1727204654.28143: done getting the remaining hosts for this loop 40074 1727204654.28148: getting the next task for host managed-node2 40074 1727204654.28155: done getting next task for host managed-node2 40074 1727204654.28158: ^ task is: TASK: Verify DNS and network connectivity 40074 1727204654.28162: ^ state is: HOST STATE: block=3, task=11, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 40074 1727204654.28166: getting variables 40074 1727204654.28167: in VariableManager get_vars() 40074 1727204654.28216: Calling all_inventory to load vars for managed-node2 40074 1727204654.28221: Calling groups_inventory to load vars for managed-node2 40074 1727204654.28224: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204654.28239: Calling all_plugins_play to load vars for managed-node2 40074 1727204654.28243: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204654.28247: Calling groups_plugins_play to load vars for managed-node2 40074 1727204654.29644: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204654.31264: done with get_vars() 40074 1727204654.31293: done getting variables 40074 1727204654.31350: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 15:04:14 -0400 (0:00:00.529) 0:00:48.075 ***** 40074 1727204654.31378: entering _queue_task() for managed-node2/shell 40074 1727204654.31672: worker is 1 (out of 1 available) 40074 1727204654.31687: exiting _queue_task() for managed-node2/shell 40074 1727204654.31702: done queuing things up, now waiting for results queue to drain 40074 1727204654.31704: waiting for pending results... 
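The trace above shows Ansible's standard remote-execution cycle for a module: the AnsiballZ payload is uploaded over sftp into a per-task temp directory, made executable with `chmod u+x`, executed with the remote Python interpreter, and the temp directory is then removed with `rm -f -r`. A minimal local re-enactment of that four-step cycle (the path and payload here are illustrative stand-ins, not Ansible's real AnsiballZ wrapper or remote tmp layout):

```shell
# Re-enactment of the module lifecycle traced in the log:
# transfer -> chmod -> execute -> cleanup.
tmpdir=$(mktemp -d)                                           # stand-in for ~/.ansible/tmp/ansible-tmp-...
printf '%s\n' 'print("ok")' > "$tmpdir/AnsiballZ_command.py"  # stand-in for the sftp "put" transfer
chmod u+x "$tmpdir" "$tmpdir/AnsiballZ_command.py"            # the "chmod u+x ... && sleep 0" step
result=$(python3 "$tmpdir/AnsiballZ_command.py")              # the "/usr/bin/python3.12 .../AnsiballZ_command.py" step
rm -rf "$tmpdir"                                              # the "rm -f -r ... > /dev/null 2>&1" cleanup
echo "$result"
```

In the real run each of these steps is a separate `_low_level_execute_command()` over the multiplexed SSH connection (`mux_client_request_session` in the stderr chunks), which is why every step produces its own block of OpenSSH debug output.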
40074 1727204654.31914: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 40074 1727204654.32016: in run() - task 12b410aa-8751-9fd7-2501-000000000bb7 40074 1727204654.32029: variable 'ansible_search_path' from source: unknown 40074 1727204654.32036: variable 'ansible_search_path' from source: unknown 40074 1727204654.32069: calling self._execute() 40074 1727204654.32155: variable 'ansible_host' from source: host vars for 'managed-node2' 40074 1727204654.32165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 40074 1727204654.32175: variable 'omit' from source: magic vars 40074 1727204654.32502: variable 'ansible_distribution_major_version' from source: facts 40074 1727204654.32513: Evaluated conditional (ansible_distribution_major_version != '6'): True 40074 1727204654.32636: variable 'ansible_facts' from source: unknown 40074 1727204654.33360: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 40074 1727204654.33364: when evaluation is False, skipping this task 40074 1727204654.33367: _execute() done 40074 1727204654.33370: dumping result to json 40074 1727204654.33376: done dumping result, returning 40074 1727204654.33382: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [12b410aa-8751-9fd7-2501-000000000bb7] 40074 1727204654.33388: sending task result for task 12b410aa-8751-9fd7-2501-000000000bb7 40074 1727204654.33487: done sending task result for task 12b410aa-8751-9fd7-2501-000000000bb7 40074 1727204654.33491: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 40074 1727204654.33548: no more pending results, returning what we have 40074 1727204654.33553: results queue empty 40074 1727204654.33554: checking for any_errors_fatal 40074 1727204654.33569: done checking for any_errors_fatal 40074 
1727204654.33570: checking for max_fail_percentage 40074 1727204654.33572: done checking for max_fail_percentage 40074 1727204654.33573: checking to see if all hosts have failed and the running result is not ok 40074 1727204654.33574: done checking to see if all hosts have failed 40074 1727204654.33575: getting the remaining hosts for this loop 40074 1727204654.33576: done getting the remaining hosts for this loop 40074 1727204654.33581: getting the next task for host managed-node2 40074 1727204654.33594: done getting next task for host managed-node2 40074 1727204654.33597: ^ task is: TASK: meta (flush_handlers) 40074 1727204654.33599: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204654.33605: getting variables 40074 1727204654.33607: in VariableManager get_vars() 40074 1727204654.33654: Calling all_inventory to load vars for managed-node2 40074 1727204654.33657: Calling groups_inventory to load vars for managed-node2 40074 1727204654.33659: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204654.33671: Calling all_plugins_play to load vars for managed-node2 40074 1727204654.33675: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204654.33678: Calling groups_plugins_play to load vars for managed-node2 40074 1727204654.34929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204654.36544: done with get_vars() 40074 1727204654.36566: done getting variables 40074 1727204654.36629: in VariableManager get_vars() 40074 1727204654.36642: Calling all_inventory to load vars for managed-node2 40074 1727204654.36644: Calling groups_inventory to load vars for managed-node2 40074 1727204654.36646: Calling 
all_plugins_inventory to load vars for managed-node2 40074 1727204654.36650: Calling all_plugins_play to load vars for managed-node2 40074 1727204654.36652: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204654.36654: Calling groups_plugins_play to load vars for managed-node2 40074 1727204654.37834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204654.39428: done with get_vars() 40074 1727204654.39455: done queuing things up, now waiting for results queue to drain 40074 1727204654.39457: results queue empty 40074 1727204654.39458: checking for any_errors_fatal 40074 1727204654.39460: done checking for any_errors_fatal 40074 1727204654.39460: checking for max_fail_percentage 40074 1727204654.39461: done checking for max_fail_percentage 40074 1727204654.39462: checking to see if all hosts have failed and the running result is not ok 40074 1727204654.39463: done checking to see if all hosts have failed 40074 1727204654.39463: getting the remaining hosts for this loop 40074 1727204654.39464: done getting the remaining hosts for this loop 40074 1727204654.39466: getting the next task for host managed-node2 40074 1727204654.39469: done getting next task for host managed-node2 40074 1727204654.39470: ^ task is: TASK: meta (flush_handlers) 40074 1727204654.39472: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 40074 1727204654.39474: getting variables 40074 1727204654.39475: in VariableManager get_vars() 40074 1727204654.39485: Calling all_inventory to load vars for managed-node2 40074 1727204654.39487: Calling groups_inventory to load vars for managed-node2 40074 1727204654.39488: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204654.39495: Calling all_plugins_play to load vars for managed-node2 40074 1727204654.39497: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204654.39499: Calling groups_plugins_play to load vars for managed-node2 40074 1727204654.40637: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204654.42206: done with get_vars() 40074 1727204654.42228: done getting variables 40074 1727204654.42271: in VariableManager get_vars() 40074 1727204654.42282: Calling all_inventory to load vars for managed-node2 40074 1727204654.42284: Calling groups_inventory to load vars for managed-node2 40074 1727204654.42286: Calling all_plugins_inventory to load vars for managed-node2 40074 1727204654.42291: Calling all_plugins_play to load vars for managed-node2 40074 1727204654.42293: Calling groups_plugins_inventory to load vars for managed-node2 40074 1727204654.42295: Calling groups_plugins_play to load vars for managed-node2 40074 1727204654.43379: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 40074 1727204654.44966: done with get_vars() 40074 1727204654.44992: done queuing things up, now waiting for results queue to drain 40074 1727204654.44994: results queue empty 40074 1727204654.44994: checking for any_errors_fatal 40074 1727204654.44995: done checking for any_errors_fatal 40074 1727204654.44996: checking for max_fail_percentage 40074 1727204654.44997: done checking for max_fail_percentage 40074 1727204654.44997: checking to see if all hosts have failed and the running result is not 
ok 40074 1727204654.44998: done checking to see if all hosts have failed 40074 1727204654.44998: getting the remaining hosts for this loop 40074 1727204654.44999: done getting the remaining hosts for this loop 40074 1727204654.45006: getting the next task for host managed-node2 40074 1727204654.45009: done getting next task for host managed-node2 40074 1727204654.45010: ^ task is: None 40074 1727204654.45011: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 40074 1727204654.45012: done queuing things up, now waiting for results queue to drain 40074 1727204654.45013: results queue empty 40074 1727204654.45013: checking for any_errors_fatal 40074 1727204654.45014: done checking for any_errors_fatal 40074 1727204654.45014: checking for max_fail_percentage 40074 1727204654.45015: done checking for max_fail_percentage 40074 1727204654.45015: checking to see if all hosts have failed and the running result is not ok 40074 1727204654.45016: done checking to see if all hosts have failed 40074 1727204654.45020: getting the next task for host managed-node2 40074 1727204654.45022: done getting next task for host managed-node2 40074 1727204654.45022: ^ task is: None 40074 1727204654.45023: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False

PLAY RECAP *********************************************************************
managed-node2              : ok=107  changed=3    unreachable=0    failed=0    skipped=88   rescued=0    ignored=2

Tuesday 24 September 2024  15:04:14 -0400 (0:00:00.137)       0:00:48.212 *****
===============================================================================
fedora.linux_system_roles.network : Check which services are running ---- 3.38s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.44s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 2.37s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Install iproute --------------------------------------------------------- 1.97s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
Install iproute --------------------------------------------------------- 1.82s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16
fedora.linux_system_roles.network : Check which packages are installed --- 1.53s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
Gathering Facts --------------------------------------------------------- 1.45s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_route_device_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.29s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_route_device.yml:3
Create veth interface ethtest1 ------------------------------------------ 1.28s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
Create veth interface ethtest0 ------------------------------------------ 1.18s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.11s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Check which packages are installed --- 1.10s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 1.09s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.06s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.88s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
Gather the minimum subset of ansible_facts required by the network role test --- 0.83s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Check if system is ostree ----------------------------------------------- 0.76s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.73s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.69s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.65s
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
40074 1727204654.45147: RUNNING CLEANUP
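The per-task timing table after PLAY RECAP is produced by a timing callback rather than by the core playbook output; given the format shown (task name, duration, file:line), this is presumably the `ansible.posix.profile_tasks` callback. A minimal sketch of the configuration that would reproduce a run like this one, assuming a project-local `ansible.cfg` (the inventory and playbook filenames below are placeholders):

```
# ansible.cfg -- hypothetical project-local config (an assumption, not taken from this log)
[defaults]
# Emit the per-task timing table after PLAY RECAP
callbacks_enabled = ansible.posix.profile_tasks
```

The `40074 <epoch>:` debug lines themselves come from running at maximum verbosity, e.g. `ansible-playbook -vvvv -i inventory.yml playbook.yml`; the "No config file found; using defaults" line in this log shows the run above used no ansible.cfg at all.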